Mar 14 08:56:33 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 14 08:56:33 crc restorecon[4702]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 14 08:56:33 crc restorecon[4702]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc 
restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc 
restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 
08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc 
restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:56:33 crc 
restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 
crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc 
restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:33 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 
crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc 
restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:56:34 crc restorecon[4702]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc 
restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 08:56:34 crc restorecon[4702]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 08:56:34 crc restorecon[4702]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 14 08:56:34 crc kubenswrapper[4956]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 14 08:56:34 crc kubenswrapper[4956]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 14 08:56:34 crc kubenswrapper[4956]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 14 08:56:34 crc kubenswrapper[4956]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 14 08:56:34 crc kubenswrapper[4956]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 14 08:56:34 crc kubenswrapper[4956]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.956241 4956 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962026 4956 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962058 4956 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962072 4956 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962084 4956 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962095 4956 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962106 4956 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962116 4956 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962126 4956 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962135 4956 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962143 4956 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962152 4956 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962162 4956 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962170 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962178 4956 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962201 4956 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962209 4956 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962216 4956 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962225 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 
08:56:34.962232 4956 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962240 4956 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962248 4956 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962255 4956 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962263 4956 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962271 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962279 4956 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962286 4956 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962294 4956 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962301 4956 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962309 4956 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962317 4956 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962325 4956 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962333 4956 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962340 4956 feature_gate.go:330] unrecognized feature gate: 
VSphereStaticIPs Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962348 4956 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962356 4956 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962363 4956 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962371 4956 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962379 4956 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962389 4956 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962397 4956 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962409 4956 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962419 4956 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962431 4956 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962440 4956 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962448 4956 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962456 4956 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962463 4956 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962471 4956 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962479 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962512 4956 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962522 4956 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962530 4956 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962538 4956 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962546 4956 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962554 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962561 4956 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962569 4956 feature_gate.go:330] unrecognized feature gate: Example Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962577 4956 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962584 4956 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962592 4956 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962600 4956 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962607 4956 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962614 4956 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962622 4956 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962630 4956 feature_gate.go:330] unrecognized feature 
gate: OVNObservability Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962822 4956 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962830 4956 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962902 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962914 4956 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962923 4956 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.962933 4956 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964058 4956 flags.go:64] FLAG: --address="0.0.0.0" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964087 4956 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964104 4956 flags.go:64] FLAG: --anonymous-auth="true" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964116 4956 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964128 4956 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964137 4956 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964149 4956 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964161 4956 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964171 4956 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 14 08:56:34 
crc kubenswrapper[4956]: I0314 08:56:34.964180 4956 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964190 4956 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964199 4956 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964208 4956 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964217 4956 flags.go:64] FLAG: --cgroup-root="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964227 4956 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964237 4956 flags.go:64] FLAG: --client-ca-file="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964246 4956 flags.go:64] FLAG: --cloud-config="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964254 4956 flags.go:64] FLAG: --cloud-provider="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964263 4956 flags.go:64] FLAG: --cluster-dns="[]" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964273 4956 flags.go:64] FLAG: --cluster-domain="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964282 4956 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964291 4956 flags.go:64] FLAG: --config-dir="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964300 4956 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964310 4956 flags.go:64] FLAG: --container-log-max-files="5" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964321 4956 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964331 4956 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 14 
08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964342 4956 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964351 4956 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964360 4956 flags.go:64] FLAG: --contention-profiling="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964369 4956 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964379 4956 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964388 4956 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964398 4956 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964409 4956 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964418 4956 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964428 4956 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964437 4956 flags.go:64] FLAG: --enable-load-reader="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964448 4956 flags.go:64] FLAG: --enable-server="true" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964456 4956 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964468 4956 flags.go:64] FLAG: --event-burst="100" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964478 4956 flags.go:64] FLAG: --event-qps="50" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964535 4956 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964545 4956 flags.go:64] FLAG: 
--event-storage-event-limit="default=0" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964553 4956 flags.go:64] FLAG: --eviction-hard="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964565 4956 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964574 4956 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964585 4956 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964596 4956 flags.go:64] FLAG: --eviction-soft="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964606 4956 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964615 4956 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964624 4956 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964633 4956 flags.go:64] FLAG: --experimental-mounter-path="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964642 4956 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964651 4956 flags.go:64] FLAG: --fail-swap-on="true" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964659 4956 flags.go:64] FLAG: --feature-gates="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964670 4956 flags.go:64] FLAG: --file-check-frequency="20s" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964679 4956 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964689 4956 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964698 4956 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964708 4956 
flags.go:64] FLAG: --healthz-port="10248" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964717 4956 flags.go:64] FLAG: --help="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964726 4956 flags.go:64] FLAG: --hostname-override="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964735 4956 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964745 4956 flags.go:64] FLAG: --http-check-frequency="20s" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964754 4956 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964762 4956 flags.go:64] FLAG: --image-credential-provider-config="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964771 4956 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964780 4956 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964790 4956 flags.go:64] FLAG: --image-service-endpoint="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964799 4956 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964807 4956 flags.go:64] FLAG: --kube-api-burst="100" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964816 4956 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964826 4956 flags.go:64] FLAG: --kube-api-qps="50" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964835 4956 flags.go:64] FLAG: --kube-reserved="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964844 4956 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964852 4956 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964861 4956 
flags.go:64] FLAG: --kubelet-cgroups="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964870 4956 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964880 4956 flags.go:64] FLAG: --lock-file="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964889 4956 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964924 4956 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964934 4956 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964949 4956 flags.go:64] FLAG: --log-json-split-stream="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964958 4956 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964967 4956 flags.go:64] FLAG: --log-text-split-stream="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964976 4956 flags.go:64] FLAG: --logging-format="text" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964985 4956 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.964995 4956 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965004 4956 flags.go:64] FLAG: --manifest-url="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965013 4956 flags.go:64] FLAG: --manifest-url-header="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965025 4956 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965034 4956 flags.go:64] FLAG: --max-open-files="1000000" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965045 4956 flags.go:64] FLAG: --max-pods="110" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965054 4956 flags.go:64] 
FLAG: --maximum-dead-containers="-1" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965063 4956 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965072 4956 flags.go:64] FLAG: --memory-manager-policy="None" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965081 4956 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965090 4956 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965100 4956 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965109 4956 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965129 4956 flags.go:64] FLAG: --node-status-max-images="50" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965138 4956 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965147 4956 flags.go:64] FLAG: --oom-score-adj="-999" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965157 4956 flags.go:64] FLAG: --pod-cidr="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965167 4956 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965180 4956 flags.go:64] FLAG: --pod-manifest-path="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965189 4956 flags.go:64] FLAG: --pod-max-pids="-1" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965198 4956 flags.go:64] FLAG: --pods-per-core="0" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965207 4956 flags.go:64] FLAG: --port="10250" Mar 14 08:56:34 crc 
kubenswrapper[4956]: I0314 08:56:34.965217 4956 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965225 4956 flags.go:64] FLAG: --provider-id="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965234 4956 flags.go:64] FLAG: --qos-reserved="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965243 4956 flags.go:64] FLAG: --read-only-port="10255" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965253 4956 flags.go:64] FLAG: --register-node="true" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965262 4956 flags.go:64] FLAG: --register-schedulable="true" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965270 4956 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965284 4956 flags.go:64] FLAG: --registry-burst="10" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965293 4956 flags.go:64] FLAG: --registry-qps="5" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965302 4956 flags.go:64] FLAG: --reserved-cpus="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965310 4956 flags.go:64] FLAG: --reserved-memory="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965321 4956 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965330 4956 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965340 4956 flags.go:64] FLAG: --rotate-certificates="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965349 4956 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965358 4956 flags.go:64] FLAG: --runonce="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965366 4956 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 
08:56:34.965375 4956 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965384 4956 flags.go:64] FLAG: --seccomp-default="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965393 4956 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965402 4956 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965411 4956 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965421 4956 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965430 4956 flags.go:64] FLAG: --storage-driver-password="root" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965439 4956 flags.go:64] FLAG: --storage-driver-secure="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965448 4956 flags.go:64] FLAG: --storage-driver-table="stats" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965682 4956 flags.go:64] FLAG: --storage-driver-user="root" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965693 4956 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965702 4956 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965711 4956 flags.go:64] FLAG: --system-cgroups="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965720 4956 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965738 4956 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965747 4956 flags.go:64] FLAG: --tls-cert-file="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965755 4956 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 14 08:56:34 
crc kubenswrapper[4956]: I0314 08:56:34.965766 4956 flags.go:64] FLAG: --tls-min-version="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965786 4956 flags.go:64] FLAG: --tls-private-key-file="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965795 4956 flags.go:64] FLAG: --topology-manager-policy="none" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965804 4956 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965813 4956 flags.go:64] FLAG: --topology-manager-scope="container" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965822 4956 flags.go:64] FLAG: --v="2" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965834 4956 flags.go:64] FLAG: --version="false" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965846 4956 flags.go:64] FLAG: --vmodule="" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965857 4956 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.965867 4956 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966087 4956 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966096 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966106 4956 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966115 4956 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966125 4956 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966133 4956 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 14 08:56:34 crc 
kubenswrapper[4956]: W0314 08:56:34.966141 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966148 4956 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966156 4956 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966164 4956 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966172 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966179 4956 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966187 4956 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966198 4956 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966208 4956 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966216 4956 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966227 4956 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966237 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966246 4956 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966255 4956 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966264 4956 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966273 4956 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966281 4956 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966294 4956 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966304 4956 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966314 4956 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966325 4956 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966334 4956 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966343 4956 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966352 4956 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966360 4956 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966368 4956 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966376 4956 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966384 4956 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966393 4956 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966400 4956 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966408 4956 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966416 4956 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966423 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966431 4956 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966439 4956 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966447 4956 
feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966454 4956 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966462 4956 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966470 4956 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966478 4956 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966509 4956 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966517 4956 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966525 4956 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966534 4956 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966542 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966549 4956 feature_gate.go:330] unrecognized feature gate: Example Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966557 4956 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966565 4956 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966573 4956 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966584 4956 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 14 08:56:34 crc 
kubenswrapper[4956]: W0314 08:56:34.966592 4956 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966599 4956 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966607 4956 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966615 4956 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966624 4956 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966632 4956 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966642 4956 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966650 4956 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966659 4956 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966667 4956 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966675 4956 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966682 4956 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966690 4956 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.966697 4956 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 
08:56:34.966705 4956 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.967589 4956 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.978022 4956 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.978067 4956 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978159 4956 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978170 4956 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978177 4956 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978182 4956 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978189 4956 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978194 4956 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978199 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978205 4956 feature_gate.go:330] 
unrecognized feature gate: SetEIPForNLBIngressController Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978212 4956 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978219 4956 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978225 4956 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978231 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978236 4956 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978241 4956 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978246 4956 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978252 4956 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978257 4956 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978262 4956 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978270 4956 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978276 4956 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978282 4956 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978287 4956 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978293 4956 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978298 4956 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978303 4956 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978308 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978313 4956 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978318 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978323 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978328 4956 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978333 4956 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978340 4956 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978345 4956 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978351 
4956 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978358 4956 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978363 4956 feature_gate.go:330] unrecognized feature gate: Example Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978369 4956 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978375 4956 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978385 4956 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978392 4956 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978400 4956 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978409 4956 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978416 4956 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978424 4956 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978431 4956 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978437 4956 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978444 4956 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978450 4956 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978457 4956 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978462 4956 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978467 4956 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978472 4956 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978478 4956 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978519 4956 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978529 4956 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978536 
4956 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978543 4956 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978548 4956 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978554 4956 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978560 4956 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978565 4956 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978570 4956 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978576 4956 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978583 4956 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978589 4956 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978594 4956 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978599 4956 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978604 4956 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978610 4956 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978616 4956 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978624 4956 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.978635 4956 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978801 4956 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978811 4956 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978817 4956 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978823 4956 feature_gate.go:330] unrecognized feature gate: Example Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978828 4956 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978833 4956 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978838 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978844 4956 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978850 4956 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978855 4956 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 
08:56:34.978861 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978866 4956 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978871 4956 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978876 4956 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978883 4956 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978891 4956 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978896 4956 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978901 4956 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978907 4956 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978954 4956 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978959 4956 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978964 4956 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978969 4956 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978975 4956 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978981 4956 feature_gate.go:330] 
unrecognized feature gate: ExternalOIDC Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978986 4956 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978991 4956 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.978997 4956 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979002 4956 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979007 4956 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979012 4956 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979017 4956 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979022 4956 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979029 4956 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979037 4956 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979043 4956 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979048 4956 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979053 4956 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979059 4956 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979064 4956 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979070 4956 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979075 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979080 4956 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979088 4956 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979095 4956 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979101 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979107 4956 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979113 4956 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979118 4956 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979124 4956 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979129 4956 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979134 4956 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979140 4956 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979144 4956 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979150 4956 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979157 4956 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979164 4956 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979171 4956 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979176 4956 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979183 4956 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979189 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979194 4956 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979200 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979205 4956 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979210 4956 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979216 4956 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979221 4956 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979227 4956 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979232 4956 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 14 08:56:34 crc kubenswrapper[4956]: W0314 08:56:34.979237 4956 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 14 08:56:34 crc 
kubenswrapper[4956]: W0314 08:56:34.979244 4956 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.979252 4956 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.980454 4956 server.go:940] "Client rotation is on, will bootstrap in background" Mar 14 08:56:34 crc kubenswrapper[4956]: E0314 08:56:34.984285 4956 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.993446 4956 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.993633 4956 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.996562 4956 server.go:997] "Starting client certificate rotation" Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.996615 4956 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 14 08:56:34 crc kubenswrapper[4956]: I0314 08:56:34.996890 4956 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.029687 4956 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 14 08:56:35 crc kubenswrapper[4956]: E0314 08:56:35.033294 4956 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.037199 4956 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.057626 4956 log.go:25] "Validated CRI v1 runtime API" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.092710 4956 log.go:25] "Validated CRI v1 image API" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.095232 4956 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.100775 4956 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-14-08-51-45-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.100808 4956 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.118825 4956 manager.go:217] Machine: {Timestamp:2026-03-14 08:56:35.116935194 +0000 UTC m=+0.629627502 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:6518bcde-aa50-4603-92c7-71dcf31294f9 BootID:da1aafa5-6606-474c-a451-a259d5bddf37 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:d2:a0:75 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:d2:a0:75 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:e3:c1:c5 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:3d:3e:99 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:27:fd:d9 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:6e:7d:ad Speed:-1 Mtu:1496} {Name:eth10 MacAddress:c2:22:20:c3:06:8f Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:32:0e:82:88:2d:7e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.119272 4956 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.119416 4956 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.120372 4956 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.120604 4956 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.120644 4956 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.120874 4956 topology_manager.go:138] "Creating topology manager with none policy" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.120887 4956 container_manager_linux.go:303] "Creating device plugin manager" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.122060 4956 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.122101 4956 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.122293 4956 state_mem.go:36] "Initialized new in-memory state store" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.122393 4956 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.125698 4956 kubelet.go:418] "Attempting to sync node with API server" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.125732 4956 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.125765 4956 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.125782 4956 kubelet.go:324] "Adding apiserver pod source" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.125796 4956 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 14 08:56:35 crc kubenswrapper[4956]: W0314 
08:56:35.133796 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 14 08:56:35 crc kubenswrapper[4956]: E0314 08:56:35.133926 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:56:35 crc kubenswrapper[4956]: W0314 08:56:35.133976 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 14 08:56:35 crc kubenswrapper[4956]: E0314 08:56:35.134250 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.136627 4956 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.138497 4956 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.140089 4956 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.141743 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.141777 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.141787 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.141800 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.141818 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.141830 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.141842 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.141862 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.141875 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.141885 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.141899 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.141909 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.142805 4956 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.143387 4956 server.go:1280] "Started kubelet" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.143637 4956 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.144379 4956 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.145001 4956 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.145247 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 14 08:56:35 crc systemd[1]: Started Kubernetes Kubelet. Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.146944 4956 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.146982 4956 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.147462 4956 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.147504 4956 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 14 08:56:35 crc kubenswrapper[4956]: E0314 08:56:35.147545 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.147633 4956 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.148399 4956 server.go:460] "Adding debug handlers to kubelet server" Mar 14 08:56:35 crc 
kubenswrapper[4956]: W0314 08:56:35.148447 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 14 08:56:35 crc kubenswrapper[4956]: E0314 08:56:35.148571 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.149237 4956 factory.go:55] Registering systemd factory Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.149263 4956 factory.go:221] Registration of the systemd container factory successfully Mar 14 08:56:35 crc kubenswrapper[4956]: E0314 08:56:35.149424 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="200ms" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.149593 4956 factory.go:153] Registering CRI-O factory Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.149606 4956 factory.go:221] Registration of the crio container factory successfully Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.149678 4956 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.149705 4956 factory.go:103] Registering Raw factory Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 
08:56:35.149721 4956 manager.go:1196] Started watching for new ooms in manager Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.150333 4956 manager.go:319] Starting recovery of all containers Mar 14 08:56:35 crc kubenswrapper[4956]: E0314 08:56:35.162688 4956 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.32:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ca962d04735fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.143349757 +0000 UTC m=+0.656042035,LastTimestamp:2026-03-14 08:56:35.143349757 +0000 UTC m=+0.656042035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169205 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169355 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169385 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169413 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169438 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169466 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169547 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169577 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169614 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" 
seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169644 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169674 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169699 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169727 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169761 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169793 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169815 4956 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169839 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169862 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169885 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169914 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169939 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169963 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.169987 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170011 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170038 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170067 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170102 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170171 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170197 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170224 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170253 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170278 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170305 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170332 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170359 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170467 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170530 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170558 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170587 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170612 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170639 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170666 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170695 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170723 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170747 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170771 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170809 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170872 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170900 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170927 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170950 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.170976 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171009 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171040 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171068 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171098 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171125 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171150 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" 
seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171179 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171206 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171234 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171258 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171286 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171344 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171371 
4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171397 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171423 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171448 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171474 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171527 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171555 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171581 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171612 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171639 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171664 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171691 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171718 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171746 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171776 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171806 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171836 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171862 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171888 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171914 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171940 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171967 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.171993 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172021 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172048 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172073 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172099 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172125 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172154 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172184 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172213 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" 
seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172238 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172266 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172296 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172322 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172349 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172376 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172405 4956 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172433 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172457 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172532 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172566 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172591 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172618 4956 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172643 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172672 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172700 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172732 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172760 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172788 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172817 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172845 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172870 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172895 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172923 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172949 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.172976 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173002 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173026 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173052 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173080 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173111 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" 
seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173141 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173166 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173189 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173215 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173242 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173270 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 14 08:56:35 crc 
kubenswrapper[4956]: I0314 08:56:35.173298 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173327 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173357 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173386 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173416 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173442 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173470 4956 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173560 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173596 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173635 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173668 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173693 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173721 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173745 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173770 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173796 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173818 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173844 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173867 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173895 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173924 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173951 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.173974 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174000 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174030 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174058 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174086 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174111 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174132 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174151 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174170 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" 
seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174188 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174207 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174239 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174258 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174277 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174296 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc 
kubenswrapper[4956]: I0314 08:56:35.174315 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174333 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174388 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174410 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174438 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174468 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174532 4956 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174557 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174585 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174612 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174636 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174659 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174684 4956 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174709 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174734 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174764 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174791 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174815 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174840 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174872 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174915 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174948 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174966 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.174995 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.175024 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.175050 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.175075 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.175100 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.178589 4956 manager.go:324] Recovery completed Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.178689 4956 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.178739 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.178755 
4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.178766 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.178780 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.178792 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.178804 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.178822 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.178835 4956 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.178855 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.178866 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.178875 4956 reconstruct.go:97] "Volume reconstruction finished" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.178883 4956 reconciler.go:26] "Reconciler: start to sync state" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.196364 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.197959 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.198023 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.198040 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.199156 4956 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.199176 4956 cpu_manager.go:226] "Reconciling" 
reconcilePeriod="10s" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.199239 4956 state_mem.go:36] "Initialized new in-memory state store" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.204732 4956 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.207968 4956 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.208033 4956 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.208072 4956 kubelet.go:2335] "Starting kubelet main sync loop" Mar 14 08:56:35 crc kubenswrapper[4956]: E0314 08:56:35.208130 4956 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 14 08:56:35 crc kubenswrapper[4956]: W0314 08:56:35.209238 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 14 08:56:35 crc kubenswrapper[4956]: E0314 08:56:35.209322 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.223170 4956 policy_none.go:49] "None policy: Start" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.224719 4956 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.224747 4956 state_mem.go:35] "Initializing new in-memory 
state store" Mar 14 08:56:35 crc kubenswrapper[4956]: E0314 08:56:35.248593 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.293149 4956 manager.go:334] "Starting Device Plugin manager" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.293226 4956 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.293247 4956 server.go:79] "Starting device plugin registration server" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.294229 4956 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.294258 4956 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.294751 4956 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.295010 4956 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.295027 4956 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 14 08:56:35 crc kubenswrapper[4956]: E0314 08:56:35.301547 4956 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.308795 4956 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.308936 4956 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.310351 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.310408 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.310432 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.310683 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.311023 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.311096 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.311765 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.311804 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.311816 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.312048 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.312306 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.312426 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.312873 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.312928 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.312948 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.318376 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.318408 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.318419 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.318452 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.318508 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.318525 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.318583 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:35 crc 
kubenswrapper[4956]: I0314 08:56:35.318981 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.319028 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.319741 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.319818 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.319844 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.320166 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.320281 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.320312 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.320765 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.320813 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.320825 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.321357 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.321541 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.321646 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.322002 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.322038 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.322050 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.322279 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.322328 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.323688 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.323727 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.323743 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:35 crc kubenswrapper[4956]: E0314 08:56:35.350303 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="400ms" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.381467 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.381572 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.381624 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.381675 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.381737 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.381822 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.381867 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.381887 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.381904 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.381921 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.381942 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.381961 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.381976 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") 
pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.382028 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.382072 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.394814 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.396640 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.396719 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.396764 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.396797 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:56:35 crc kubenswrapper[4956]: E0314 08:56:35.397546 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483012 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483105 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483184 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483227 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483263 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483293 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483330 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483322 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483365 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483383 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483383 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483418 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483748 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483785 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483817 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483847 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483880 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483946 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483573 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483544 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483533 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483439 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.484355 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.484381 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483577 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.484396 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.484429 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.484421 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.483535 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.484866 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.598178 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.600338 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.600396 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.600415 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.600452 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:56:35 crc kubenswrapper[4956]: E0314 08:56:35.601081 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: 
connection refused" node="crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.661625 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.690963 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.708199 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: W0314 08:56:35.716614 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-3e4b8c342aa68590cc6ede5aa13a4fe8e50099c6241f09f22916a6b0bd4d0a44 WatchSource:0}: Error finding container 3e4b8c342aa68590cc6ede5aa13a4fe8e50099c6241f09f22916a6b0bd4d0a44: Status 404 returned error can't find the container with id 3e4b8c342aa68590cc6ede5aa13a4fe8e50099c6241f09f22916a6b0bd4d0a44 Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.727354 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: W0314 08:56:35.734925 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-7a43a739d709bc7358501caecc093d82687b8a7d6d10bdbf2c0d52ce60a09e6d WatchSource:0}: Error finding container 7a43a739d709bc7358501caecc093d82687b8a7d6d10bdbf2c0d52ce60a09e6d: Status 404 returned error can't find the container with id 7a43a739d709bc7358501caecc093d82687b8a7d6d10bdbf2c0d52ce60a09e6d Mar 14 08:56:35 crc kubenswrapper[4956]: I0314 08:56:35.742697 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 14 08:56:35 crc kubenswrapper[4956]: W0314 08:56:35.750191 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-009b4a00413dac4a19cba7b2eede1965121663ca4e140a85583e3c8dcf6c1d22 WatchSource:0}: Error finding container 009b4a00413dac4a19cba7b2eede1965121663ca4e140a85583e3c8dcf6c1d22: Status 404 returned error can't find the container with id 009b4a00413dac4a19cba7b2eede1965121663ca4e140a85583e3c8dcf6c1d22 Mar 14 08:56:35 crc kubenswrapper[4956]: E0314 08:56:35.751930 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="800ms" Mar 14 08:56:35 crc kubenswrapper[4956]: W0314 08:56:35.766398 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-ef4c242927e715c7ce4b29f670a4c34ea8454f297cd789e569cf2bb50ca12017 WatchSource:0}: Error finding 
container ef4c242927e715c7ce4b29f670a4c34ea8454f297cd789e569cf2bb50ca12017: Status 404 returned error can't find the container with id ef4c242927e715c7ce4b29f670a4c34ea8454f297cd789e569cf2bb50ca12017 Mar 14 08:56:35 crc kubenswrapper[4956]: W0314 08:56:35.996831 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 14 08:56:35 crc kubenswrapper[4956]: E0314 08:56:35.996939 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:56:36 crc kubenswrapper[4956]: I0314 08:56:36.001221 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:36 crc kubenswrapper[4956]: I0314 08:56:36.002860 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:36 crc kubenswrapper[4956]: I0314 08:56:36.002892 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:36 crc kubenswrapper[4956]: I0314 08:56:36.002902 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:36 crc kubenswrapper[4956]: I0314 08:56:36.002938 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:56:36 crc kubenswrapper[4956]: E0314 08:56:36.003391 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: 
connect: connection refused" node="crc" Mar 14 08:56:36 crc kubenswrapper[4956]: I0314 08:56:36.146687 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 14 08:56:36 crc kubenswrapper[4956]: W0314 08:56:36.206853 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 14 08:56:36 crc kubenswrapper[4956]: E0314 08:56:36.206954 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:56:36 crc kubenswrapper[4956]: I0314 08:56:36.213516 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3e4b8c342aa68590cc6ede5aa13a4fe8e50099c6241f09f22916a6b0bd4d0a44"} Mar 14 08:56:36 crc kubenswrapper[4956]: I0314 08:56:36.215816 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ef4c242927e715c7ce4b29f670a4c34ea8454f297cd789e569cf2bb50ca12017"} Mar 14 08:56:36 crc kubenswrapper[4956]: I0314 08:56:36.217000 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"009b4a00413dac4a19cba7b2eede1965121663ca4e140a85583e3c8dcf6c1d22"} Mar 14 08:56:36 crc kubenswrapper[4956]: I0314 08:56:36.218784 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7a43a739d709bc7358501caecc093d82687b8a7d6d10bdbf2c0d52ce60a09e6d"} Mar 14 08:56:36 crc kubenswrapper[4956]: I0314 08:56:36.219746 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"da6cd84164e65cf845639180e2b812b7f92ed8ea68e6f3be921b099f7798824f"} Mar 14 08:56:36 crc kubenswrapper[4956]: W0314 08:56:36.272940 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 14 08:56:36 crc kubenswrapper[4956]: E0314 08:56:36.273088 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:56:36 crc kubenswrapper[4956]: E0314 08:56:36.553384 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="1.6s" Mar 14 08:56:36 crc kubenswrapper[4956]: W0314 08:56:36.703283 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed 
to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 14 08:56:36 crc kubenswrapper[4956]: E0314 08:56:36.703361 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:56:36 crc kubenswrapper[4956]: I0314 08:56:36.803751 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:36 crc kubenswrapper[4956]: I0314 08:56:36.805675 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:36 crc kubenswrapper[4956]: I0314 08:56:36.805723 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:36 crc kubenswrapper[4956]: I0314 08:56:36.805735 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:36 crc kubenswrapper[4956]: I0314 08:56:36.805766 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:56:36 crc kubenswrapper[4956]: E0314 08:56:36.807052 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.145938 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 14 08:56:37 crc 
kubenswrapper[4956]: I0314 08:56:37.226541 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0e700bc758b969fb98409b212464daee2b6af2801b09942fb2dc37ca56a5ba77"} Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.226644 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d07fcb9bc726f1834ca31b99f2906ceb517673b36e1b1527e2e157dc8540cf81"} Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.226672 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"52f963cc29f2d8c086ca6286cfe2cc00d0c427910d03234fca73e6240c9bdcc0"} Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.228894 4956 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b" exitCode=0 Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.229015 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b"} Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.229089 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.230678 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.230748 4956 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.230776 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.231350 4956 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300" exitCode=0 Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.231550 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300"} Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.231645 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.233447 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.233495 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.233510 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.234426 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.234451 4956 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.235659 4956 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" 
containerID="d8f2eca23cbbc7619ab093bc76dca55f111fbc069a1a93e88e9ad077d5ac3341" exitCode=0 Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.235731 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"d8f2eca23cbbc7619ab093bc76dca55f111fbc069a1a93e88e9ad077d5ac3341"} Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.235894 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:37 crc kubenswrapper[4956]: E0314 08:56:37.237665 4956 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.240025 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.240077 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.240109 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.240130 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.240141 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.240169 4956 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.244211 4956 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993" exitCode=0 Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.244281 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993"} Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.244547 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.246414 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.246463 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:37 crc kubenswrapper[4956]: I0314 08:56:37.246495 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:38 crc kubenswrapper[4956]: W0314 08:56:38.061540 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 14 08:56:38 crc kubenswrapper[4956]: E0314 08:56:38.061644 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.146911 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 14 08:56:38 crc kubenswrapper[4956]: E0314 08:56:38.154751 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="3.2s" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.252825 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b"} Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.252873 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49"} Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.252885 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820"} Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.252968 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:38 crc 
kubenswrapper[4956]: I0314 08:56:38.253738 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.253763 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.253773 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.256471 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5383fab4a9d65b713dd5b265be1c13383e501e556a66b3cec68048b47f7eeaec"} Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.256601 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.257367 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.257396 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.257408 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.259816 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1"} Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.259846 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa"} Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.259861 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d"} Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.259873 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e"} Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.264186 4956 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534" exitCode=0 Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.268918 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.268930 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534"} Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.270112 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.270163 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.270179 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.270745 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b214fbcc640386ff8fd014404394f14a77bd9a8570ac1533357e7112f8ca4fa6"} Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.270808 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.273222 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.273251 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.273261 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.407355 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.408679 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.408723 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.408738 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:38 crc kubenswrapper[4956]: I0314 08:56:38.408761 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:56:38 crc kubenswrapper[4956]: E0314 08:56:38.409229 4956 kubelet_node_status.go:99] "Unable to register node with API server" 
err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Mar 14 08:56:38 crc kubenswrapper[4956]: W0314 08:56:38.973615 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 14 08:56:38 crc kubenswrapper[4956]: E0314 08:56:38.973741 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:56:39 crc kubenswrapper[4956]: W0314 08:56:39.011502 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 14 08:56:39 crc kubenswrapper[4956]: E0314 08:56:39.011602 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:56:39 crc kubenswrapper[4956]: W0314 08:56:39.015385 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 14 08:56:39 crc kubenswrapper[4956]: E0314 08:56:39.015440 4956 reflector.go:158] "Unhandled 
Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.124015 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.146528 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.276241 4956 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c" exitCode=0 Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.276343 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c"} Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.276549 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.277816 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.277883 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.277899 4956 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.282130 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f3ef9f0749ee0ae2f5e247fac558a67526532794c60a0f36e1a0bd2446d0682f"} Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.282187 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.282209 4956 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.282236 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.282287 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.282157 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.283444 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.283470 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.283492 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.283686 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.283736 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.283760 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.283991 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.284004 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.284012 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.284496 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.284512 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.284521 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.368865 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:56:39 crc kubenswrapper[4956]: I0314 08:56:39.377007 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:56:39 crc kubenswrapper[4956]: E0314 08:56:39.456228 4956 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.32:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ca962d04735fd default 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.143349757 +0000 UTC m=+0.656042035,LastTimestamp:2026-03-14 08:56:35.143349757 +0000 UTC m=+0.656042035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:56:40 crc kubenswrapper[4956]: I0314 08:56:40.294192 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f"} Mar 14 08:56:40 crc kubenswrapper[4956]: I0314 08:56:40.294263 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935"} Mar 14 08:56:40 crc kubenswrapper[4956]: I0314 08:56:40.294282 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8"} Mar 14 08:56:40 crc kubenswrapper[4956]: I0314 08:56:40.296827 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 14 08:56:40 crc kubenswrapper[4956]: I0314 08:56:40.299227 4956 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f3ef9f0749ee0ae2f5e247fac558a67526532794c60a0f36e1a0bd2446d0682f" exitCode=255 Mar 14 08:56:40 crc kubenswrapper[4956]: I0314 08:56:40.299307 4956 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f3ef9f0749ee0ae2f5e247fac558a67526532794c60a0f36e1a0bd2446d0682f"} Mar 14 08:56:40 crc kubenswrapper[4956]: I0314 08:56:40.299403 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:40 crc kubenswrapper[4956]: I0314 08:56:40.299404 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:40 crc kubenswrapper[4956]: I0314 08:56:40.300581 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:40 crc kubenswrapper[4956]: I0314 08:56:40.300618 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:40 crc kubenswrapper[4956]: I0314 08:56:40.300632 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:40 crc kubenswrapper[4956]: I0314 08:56:40.300743 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:40 crc kubenswrapper[4956]: I0314 08:56:40.300782 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:40 crc kubenswrapper[4956]: I0314 08:56:40.300802 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:40 crc kubenswrapper[4956]: I0314 08:56:40.301452 4956 scope.go:117] "RemoveContainer" containerID="f3ef9f0749ee0ae2f5e247fac558a67526532794c60a0f36e1a0bd2446d0682f" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.307298 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa"} Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.307354 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e"} Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.307436 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.308458 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.308518 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.308533 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.310626 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.312907 4956 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.312941 4956 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.312993 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.312946 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:41 crc 
kubenswrapper[4956]: I0314 08:56:41.312915 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dc3e9eb735abef64dd9218b8f5e7149e78f1eaf253b4f26f4675a0c32b9832f7"} Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.313977 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.314037 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.314059 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.313984 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.314113 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.314124 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.352038 4956 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.609932 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.612019 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.612108 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.612126 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.612176 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:56:41 crc kubenswrapper[4956]: I0314 08:56:41.661937 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:56:42 crc kubenswrapper[4956]: I0314 08:56:42.125169 4956 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 08:56:42 crc kubenswrapper[4956]: I0314 08:56:42.125651 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 08:56:42 crc kubenswrapper[4956]: I0314 08:56:42.316299 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:42 crc kubenswrapper[4956]: I0314 08:56:42.316386 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:42 crc kubenswrapper[4956]: I0314 08:56:42.318382 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:42 crc kubenswrapper[4956]: I0314 08:56:42.318467 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 14 08:56:42 crc kubenswrapper[4956]: I0314 08:56:42.318530 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:42 crc kubenswrapper[4956]: I0314 08:56:42.319069 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:42 crc kubenswrapper[4956]: I0314 08:56:42.319099 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:42 crc kubenswrapper[4956]: I0314 08:56:42.319114 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:43 crc kubenswrapper[4956]: I0314 08:56:43.018134 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:56:43 crc kubenswrapper[4956]: I0314 08:56:43.319203 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:43 crc kubenswrapper[4956]: I0314 08:56:43.320704 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:43 crc kubenswrapper[4956]: I0314 08:56:43.320743 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:43 crc kubenswrapper[4956]: I0314 08:56:43.320755 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:43 crc kubenswrapper[4956]: I0314 08:56:43.462239 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:56:43 crc kubenswrapper[4956]: I0314 08:56:43.462477 4956 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 08:56:43 crc kubenswrapper[4956]: I0314 08:56:43.462586 4956 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:43 crc kubenswrapper[4956]: I0314 08:56:43.464261 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:43 crc kubenswrapper[4956]: I0314 08:56:43.464295 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:43 crc kubenswrapper[4956]: I0314 08:56:43.464306 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:43 crc kubenswrapper[4956]: I0314 08:56:43.916041 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:56:44 crc kubenswrapper[4956]: I0314 08:56:44.322039 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:44 crc kubenswrapper[4956]: I0314 08:56:44.323431 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:44 crc kubenswrapper[4956]: I0314 08:56:44.323519 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:44 crc kubenswrapper[4956]: I0314 08:56:44.323540 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:44 crc kubenswrapper[4956]: I0314 08:56:44.580005 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 14 08:56:44 crc kubenswrapper[4956]: I0314 08:56:44.580326 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:44 crc kubenswrapper[4956]: I0314 08:56:44.582147 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:44 crc 
kubenswrapper[4956]: I0314 08:56:44.582204 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:44 crc kubenswrapper[4956]: I0314 08:56:44.582214 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:45 crc kubenswrapper[4956]: E0314 08:56:45.301837 4956 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 08:56:45 crc kubenswrapper[4956]: I0314 08:56:45.325126 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:45 crc kubenswrapper[4956]: I0314 08:56:45.326529 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:45 crc kubenswrapper[4956]: I0314 08:56:45.326582 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:45 crc kubenswrapper[4956]: I0314 08:56:45.326600 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:45 crc kubenswrapper[4956]: I0314 08:56:45.410686 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:56:45 crc kubenswrapper[4956]: I0314 08:56:45.411030 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:45 crc kubenswrapper[4956]: I0314 08:56:45.413353 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:45 crc kubenswrapper[4956]: I0314 08:56:45.413422 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:45 crc kubenswrapper[4956]: I0314 08:56:45.413446 4956 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:45 crc kubenswrapper[4956]: I0314 08:56:45.456213 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:56:45 crc kubenswrapper[4956]: I0314 08:56:45.456567 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:45 crc kubenswrapper[4956]: I0314 08:56:45.458664 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:45 crc kubenswrapper[4956]: I0314 08:56:45.458737 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:45 crc kubenswrapper[4956]: I0314 08:56:45.458760 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:47 crc kubenswrapper[4956]: I0314 08:56:47.965022 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 14 08:56:47 crc kubenswrapper[4956]: I0314 08:56:47.965427 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:47 crc kubenswrapper[4956]: I0314 08:56:47.966848 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:47 crc kubenswrapper[4956]: I0314 08:56:47.966888 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:47 crc kubenswrapper[4956]: I0314 08:56:47.966900 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:50 crc kubenswrapper[4956]: I0314 08:56:50.147360 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 14 08:56:50 crc kubenswrapper[4956]: W0314 08:56:50.433205 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:50Z is after 2026-02-23T05:33:13Z Mar 14 08:56:50 crc kubenswrapper[4956]: E0314 08:56:50.433304 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 08:56:50 crc kubenswrapper[4956]: I0314 08:56:50.440019 4956 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 14 08:56:50 crc kubenswrapper[4956]: I0314 08:56:50.440095 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 14 08:56:50 crc kubenswrapper[4956]: W0314 08:56:50.444745 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:50Z is after 2026-02-23T05:33:13Z Mar 14 08:56:50 crc kubenswrapper[4956]: E0314 08:56:50.444833 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 08:56:50 crc kubenswrapper[4956]: E0314 08:56:50.446375 4956 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 08:56:50 crc kubenswrapper[4956]: I0314 08:56:50.446589 4956 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 14 08:56:50 crc kubenswrapper[4956]: I0314 08:56:50.446636 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 14 08:56:50 crc kubenswrapper[4956]: W0314 08:56:50.450455 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:50Z is after 2026-02-23T05:33:13Z Mar 14 08:56:50 crc kubenswrapper[4956]: E0314 08:56:50.450562 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 08:56:50 crc kubenswrapper[4956]: W0314 08:56:50.450617 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:50Z is after 2026-02-23T05:33:13Z Mar 14 08:56:50 crc kubenswrapper[4956]: E0314 08:56:50.450658 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 08:56:50 crc kubenswrapper[4956]: 
E0314 08:56:50.450883 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:50Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 14 08:56:50 crc kubenswrapper[4956]: E0314 08:56:50.454170 4956 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:50Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ca962d04735fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.143349757 +0000 UTC m=+0.656042035,LastTimestamp:2026-03-14 08:56:35.143349757 +0000 UTC m=+0.656042035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:56:50 crc kubenswrapper[4956]: E0314 08:56:50.454333 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:50Z is after 2026-02-23T05:33:13Z" node="crc" Mar 14 08:56:51 crc kubenswrapper[4956]: I0314 08:56:51.154817 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-14T08:56:51Z is after 2026-02-23T05:33:13Z Mar 14 08:56:51 crc kubenswrapper[4956]: I0314 08:56:51.342983 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 14 08:56:51 crc kubenswrapper[4956]: I0314 08:56:51.343610 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 14 08:56:51 crc kubenswrapper[4956]: I0314 08:56:51.345404 4956 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dc3e9eb735abef64dd9218b8f5e7149e78f1eaf253b4f26f4675a0c32b9832f7" exitCode=255 Mar 14 08:56:51 crc kubenswrapper[4956]: I0314 08:56:51.345450 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dc3e9eb735abef64dd9218b8f5e7149e78f1eaf253b4f26f4675a0c32b9832f7"} Mar 14 08:56:51 crc kubenswrapper[4956]: I0314 08:56:51.345549 4956 scope.go:117] "RemoveContainer" containerID="f3ef9f0749ee0ae2f5e247fac558a67526532794c60a0f36e1a0bd2446d0682f" Mar 14 08:56:51 crc kubenswrapper[4956]: I0314 08:56:51.345734 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:51 crc kubenswrapper[4956]: I0314 08:56:51.346609 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:51 crc kubenswrapper[4956]: I0314 08:56:51.346645 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:51 crc kubenswrapper[4956]: I0314 08:56:51.346656 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 14 08:56:51 crc kubenswrapper[4956]: I0314 08:56:51.347182 4956 scope.go:117] "RemoveContainer" containerID="dc3e9eb735abef64dd9218b8f5e7149e78f1eaf253b4f26f4675a0c32b9832f7" Mar 14 08:56:51 crc kubenswrapper[4956]: E0314 08:56:51.347369 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:56:52 crc kubenswrapper[4956]: I0314 08:56:52.125994 4956 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 08:56:52 crc kubenswrapper[4956]: I0314 08:56:52.126143 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 08:56:52 crc kubenswrapper[4956]: I0314 08:56:52.153201 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:52Z is after 2026-02-23T05:33:13Z Mar 14 08:56:52 crc kubenswrapper[4956]: I0314 
08:56:52.352116 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 14 08:56:53 crc kubenswrapper[4956]: I0314 08:56:53.026654 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:56:53 crc kubenswrapper[4956]: I0314 08:56:53.026994 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:53 crc kubenswrapper[4956]: I0314 08:56:53.028583 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:53 crc kubenswrapper[4956]: I0314 08:56:53.028655 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:53 crc kubenswrapper[4956]: I0314 08:56:53.028680 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:53 crc kubenswrapper[4956]: I0314 08:56:53.029623 4956 scope.go:117] "RemoveContainer" containerID="dc3e9eb735abef64dd9218b8f5e7149e78f1eaf253b4f26f4675a0c32b9832f7" Mar 14 08:56:53 crc kubenswrapper[4956]: E0314 08:56:53.029922 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:56:53 crc kubenswrapper[4956]: I0314 08:56:53.034648 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:56:53 crc kubenswrapper[4956]: I0314 08:56:53.150921 4956 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:53Z is after 2026-02-23T05:33:13Z Mar 14 08:56:53 crc kubenswrapper[4956]: I0314 08:56:53.358085 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:53 crc kubenswrapper[4956]: I0314 08:56:53.359104 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:53 crc kubenswrapper[4956]: I0314 08:56:53.359144 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:53 crc kubenswrapper[4956]: I0314 08:56:53.359155 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:53 crc kubenswrapper[4956]: I0314 08:56:53.359645 4956 scope.go:117] "RemoveContainer" containerID="dc3e9eb735abef64dd9218b8f5e7149e78f1eaf253b4f26f4675a0c32b9832f7" Mar 14 08:56:53 crc kubenswrapper[4956]: E0314 08:56:53.359804 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:56:54 crc kubenswrapper[4956]: I0314 08:56:54.149660 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:54Z is after 
2026-02-23T05:33:13Z Mar 14 08:56:54 crc kubenswrapper[4956]: I0314 08:56:54.462731 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:56:54 crc kubenswrapper[4956]: I0314 08:56:54.463010 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:54 crc kubenswrapper[4956]: I0314 08:56:54.464402 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:54 crc kubenswrapper[4956]: I0314 08:56:54.464454 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:54 crc kubenswrapper[4956]: I0314 08:56:54.464464 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:54 crc kubenswrapper[4956]: I0314 08:56:54.465080 4956 scope.go:117] "RemoveContainer" containerID="dc3e9eb735abef64dd9218b8f5e7149e78f1eaf253b4f26f4675a0c32b9832f7" Mar 14 08:56:54 crc kubenswrapper[4956]: E0314 08:56:54.465258 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:56:55 crc kubenswrapper[4956]: I0314 08:56:55.148715 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:55Z is after 2026-02-23T05:33:13Z Mar 14 08:56:55 crc kubenswrapper[4956]: E0314 08:56:55.302036 
4956 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 08:56:55 crc kubenswrapper[4956]: I0314 08:56:55.461402 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:56:55 crc kubenswrapper[4956]: I0314 08:56:55.461561 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:55 crc kubenswrapper[4956]: I0314 08:56:55.462803 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:55 crc kubenswrapper[4956]: I0314 08:56:55.462830 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:55 crc kubenswrapper[4956]: I0314 08:56:55.462838 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:56 crc kubenswrapper[4956]: I0314 08:56:56.150101 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:56Z is after 2026-02-23T05:33:13Z Mar 14 08:56:56 crc kubenswrapper[4956]: I0314 08:56:56.854721 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:56 crc kubenswrapper[4956]: E0314 08:56:56.854804 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:56Z is after 2026-02-23T05:33:13Z" interval="7s" 
Mar 14 08:56:56 crc kubenswrapper[4956]: I0314 08:56:56.855912 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:56 crc kubenswrapper[4956]: I0314 08:56:56.855943 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:56 crc kubenswrapper[4956]: I0314 08:56:56.855953 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:56 crc kubenswrapper[4956]: I0314 08:56:56.855974 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:56:56 crc kubenswrapper[4956]: E0314 08:56:56.859575 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:56Z is after 2026-02-23T05:33:13Z" node="crc" Mar 14 08:56:57 crc kubenswrapper[4956]: I0314 08:56:57.151539 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:57Z is after 2026-02-23T05:33:13Z Mar 14 08:56:57 crc kubenswrapper[4956]: W0314 08:56:57.230925 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:57Z is after 2026-02-23T05:33:13Z Mar 14 08:56:57 crc kubenswrapper[4956]: E0314 08:56:57.231035 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:57Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 08:56:57 crc kubenswrapper[4956]: I0314 08:56:57.995446 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 14 08:56:57 crc kubenswrapper[4956]: I0314 08:56:57.995849 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:57 crc kubenswrapper[4956]: I0314 08:56:57.997385 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:57 crc kubenswrapper[4956]: I0314 08:56:57.997436 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:57 crc kubenswrapper[4956]: I0314 08:56:57.997455 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:58 crc kubenswrapper[4956]: I0314 08:56:58.016552 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 14 08:56:58 crc kubenswrapper[4956]: I0314 08:56:58.151926 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:58Z is after 2026-02-23T05:33:13Z Mar 14 08:56:58 crc kubenswrapper[4956]: I0314 08:56:58.371695 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:56:58 crc kubenswrapper[4956]: I0314 08:56:58.373363 4956 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:56:58 crc kubenswrapper[4956]: I0314 08:56:58.373423 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:56:58 crc kubenswrapper[4956]: I0314 08:56:58.373435 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:56:58 crc kubenswrapper[4956]: W0314 08:56:58.789746 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:58Z is after 2026-02-23T05:33:13Z Mar 14 08:56:58 crc kubenswrapper[4956]: E0314 08:56:58.789830 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 08:56:59 crc kubenswrapper[4956]: I0314 08:56:59.149790 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:59Z is after 2026-02-23T05:33:13Z Mar 14 08:56:59 crc kubenswrapper[4956]: I0314 08:56:59.181376 4956 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 08:56:59 crc kubenswrapper[4956]: E0314 08:56:59.187208 4956 
certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 08:56:59 crc kubenswrapper[4956]: W0314 08:56:59.353768 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:59Z is after 2026-02-23T05:33:13Z Mar 14 08:56:59 crc kubenswrapper[4956]: E0314 08:56:59.353871 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:56:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 08:57:00 crc kubenswrapper[4956]: I0314 08:57:00.151104 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:00Z is after 2026-02-23T05:33:13Z Mar 14 08:57:00 crc kubenswrapper[4956]: W0314 08:57:00.387379 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:00Z is after 2026-02-23T05:33:13Z Mar 14 08:57:00 crc kubenswrapper[4956]: E0314 08:57:00.387511 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 08:57:00 crc kubenswrapper[4956]: E0314 08:57:00.460726 4956 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:00Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ca962d04735fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.143349757 +0000 UTC m=+0.656042035,LastTimestamp:2026-03-14 08:56:35.143349757 +0000 UTC m=+0.656042035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:01 crc kubenswrapper[4956]: I0314 08:57:01.150908 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:01Z is after 2026-02-23T05:33:13Z Mar 14 08:57:02 crc kubenswrapper[4956]: I0314 08:57:02.125773 4956 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 08:57:02 crc kubenswrapper[4956]: I0314 08:57:02.125863 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 08:57:02 crc kubenswrapper[4956]: I0314 08:57:02.125933 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:57:02 crc kubenswrapper[4956]: I0314 08:57:02.126159 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:02 crc kubenswrapper[4956]: I0314 08:57:02.128291 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:02 crc kubenswrapper[4956]: I0314 08:57:02.128320 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:02 crc kubenswrapper[4956]: I0314 08:57:02.128333 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:02 crc kubenswrapper[4956]: I0314 08:57:02.128856 4956 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"d07fcb9bc726f1834ca31b99f2906ceb517673b36e1b1527e2e157dc8540cf81"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 14 08:57:02 crc kubenswrapper[4956]: I0314 08:57:02.129048 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://d07fcb9bc726f1834ca31b99f2906ceb517673b36e1b1527e2e157dc8540cf81" gracePeriod=30 Mar 14 08:57:02 crc kubenswrapper[4956]: I0314 08:57:02.152064 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:02Z is after 2026-02-23T05:33:13Z Mar 14 08:57:02 crc kubenswrapper[4956]: I0314 08:57:02.384115 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 14 08:57:02 crc kubenswrapper[4956]: I0314 08:57:02.384539 4956 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d07fcb9bc726f1834ca31b99f2906ceb517673b36e1b1527e2e157dc8540cf81" exitCode=255 Mar 14 08:57:02 crc kubenswrapper[4956]: I0314 08:57:02.384591 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d07fcb9bc726f1834ca31b99f2906ceb517673b36e1b1527e2e157dc8540cf81"} Mar 14 08:57:03 crc kubenswrapper[4956]: I0314 
08:57:03.149883 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:03Z is after 2026-02-23T05:33:13Z Mar 14 08:57:03 crc kubenswrapper[4956]: I0314 08:57:03.391077 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 14 08:57:03 crc kubenswrapper[4956]: I0314 08:57:03.392322 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7fac7c1aebf7e2e76475ec534d7f6d29771289f555a065925548213ae265250a"} Mar 14 08:57:03 crc kubenswrapper[4956]: I0314 08:57:03.392526 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:03 crc kubenswrapper[4956]: I0314 08:57:03.394337 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:03 crc kubenswrapper[4956]: I0314 08:57:03.394379 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:03 crc kubenswrapper[4956]: I0314 08:57:03.394395 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:03 crc kubenswrapper[4956]: I0314 08:57:03.462784 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:57:03 crc kubenswrapper[4956]: I0314 08:57:03.859705 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:03 
crc kubenswrapper[4956]: E0314 08:57:03.859710 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:03Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 14 08:57:03 crc kubenswrapper[4956]: I0314 08:57:03.863927 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:57:03 crc kubenswrapper[4956]: I0314 08:57:03.863976 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:57:03 crc kubenswrapper[4956]: I0314 08:57:03.863991 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:57:03 crc kubenswrapper[4956]: I0314 08:57:03.864019 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 14 08:57:03 crc kubenswrapper[4956]: E0314 08:57:03.867744 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:03Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 14 08:57:04 crc kubenswrapper[4956]: I0314 08:57:04.151809 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:04Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:04 crc kubenswrapper[4956]: I0314 08:57:04.395925 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:57:04 crc kubenswrapper[4956]: I0314 08:57:04.397409 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:57:04 crc kubenswrapper[4956]: I0314 08:57:04.397518 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:57:04 crc kubenswrapper[4956]: I0314 08:57:04.397548 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:57:05 crc kubenswrapper[4956]: I0314 08:57:05.150887 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:05Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:05 crc kubenswrapper[4956]: I0314 08:57:05.208820 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:57:05 crc kubenswrapper[4956]: I0314 08:57:05.210664 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:57:05 crc kubenswrapper[4956]: I0314 08:57:05.210758 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:57:05 crc kubenswrapper[4956]: I0314 08:57:05.210784 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:57:05 crc kubenswrapper[4956]: I0314 08:57:05.212017 4956 scope.go:117] "RemoveContainer" containerID="dc3e9eb735abef64dd9218b8f5e7149e78f1eaf253b4f26f4675a0c32b9832f7"
Mar 14 08:57:05 crc kubenswrapper[4956]: E0314 08:57:05.303875 4956 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 14 08:57:05 crc kubenswrapper[4956]: I0314 08:57:05.398017 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:57:05 crc kubenswrapper[4956]: I0314 08:57:05.399175 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:57:05 crc kubenswrapper[4956]: I0314 08:57:05.399211 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:57:05 crc kubenswrapper[4956]: I0314 08:57:05.399221 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:57:06 crc kubenswrapper[4956]: I0314 08:57:06.151113 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:06Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:06 crc kubenswrapper[4956]: I0314 08:57:06.403126 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 14 08:57:06 crc kubenswrapper[4956]: I0314 08:57:06.405506 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"356eae237884acd7b18b80c8498ee1a7276bce84a459870157c26cd50a779851"}
Mar 14 08:57:06 crc kubenswrapper[4956]: I0314 08:57:06.406195 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:57:06 crc kubenswrapper[4956]: I0314 08:57:06.407523 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:57:06 crc kubenswrapper[4956]: I0314 08:57:06.407647 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:57:06 crc kubenswrapper[4956]: I0314 08:57:06.407735 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:57:07 crc kubenswrapper[4956]: I0314 08:57:07.150074 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:07Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:07 crc kubenswrapper[4956]: I0314 08:57:07.411312 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 14 08:57:07 crc kubenswrapper[4956]: I0314 08:57:07.413019 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 14 08:57:07 crc kubenswrapper[4956]: I0314 08:57:07.415100 4956 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="356eae237884acd7b18b80c8498ee1a7276bce84a459870157c26cd50a779851" exitCode=255
Mar 14 08:57:07 crc kubenswrapper[4956]: I0314 08:57:07.415155 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"356eae237884acd7b18b80c8498ee1a7276bce84a459870157c26cd50a779851"}
Mar 14 08:57:07 crc kubenswrapper[4956]: I0314 08:57:07.415194 4956 scope.go:117] "RemoveContainer" containerID="dc3e9eb735abef64dd9218b8f5e7149e78f1eaf253b4f26f4675a0c32b9832f7"
Mar 14 08:57:07 crc kubenswrapper[4956]: I0314 08:57:07.415362 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:57:07 crc kubenswrapper[4956]: I0314 08:57:07.416375 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:57:07 crc kubenswrapper[4956]: I0314 08:57:07.416412 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:57:07 crc kubenswrapper[4956]: I0314 08:57:07.416427 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:57:07 crc kubenswrapper[4956]: I0314 08:57:07.417053 4956 scope.go:117] "RemoveContainer" containerID="356eae237884acd7b18b80c8498ee1a7276bce84a459870157c26cd50a779851"
Mar 14 08:57:07 crc kubenswrapper[4956]: E0314 08:57:07.417274 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 14 08:57:08 crc kubenswrapper[4956]: I0314 08:57:08.151519 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:08Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:08 crc kubenswrapper[4956]: I0314 08:57:08.422260 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 14 08:57:09 crc kubenswrapper[4956]: I0314 08:57:09.124728 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 14 08:57:09 crc kubenswrapper[4956]: I0314 08:57:09.125195 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:57:09 crc kubenswrapper[4956]: I0314 08:57:09.126926 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:57:09 crc kubenswrapper[4956]: I0314 08:57:09.126988 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:57:09 crc kubenswrapper[4956]: I0314 08:57:09.127008 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:57:09 crc kubenswrapper[4956]: I0314 08:57:09.151633 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:09Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:10 crc kubenswrapper[4956]: I0314 08:57:10.151961 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:10Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:10 crc kubenswrapper[4956]: E0314 08:57:10.467711 4956 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:10Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ca962d04735fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.143349757 +0000 UTC m=+0.656042035,LastTimestamp:2026-03-14 08:56:35.143349757 +0000 UTC m=+0.656042035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 08:57:10 crc kubenswrapper[4956]: E0314 08:57:10.866399 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:10Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 14 08:57:10 crc kubenswrapper[4956]: I0314 08:57:10.868546 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:57:10 crc kubenswrapper[4956]: I0314 08:57:10.870409 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:57:10 crc kubenswrapper[4956]: I0314 08:57:10.870677 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:57:10 crc kubenswrapper[4956]: I0314 08:57:10.870888 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:57:10 crc kubenswrapper[4956]: I0314 08:57:10.871097 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 14 08:57:10 crc kubenswrapper[4956]: E0314 08:57:10.876561 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:10Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 14 08:57:11 crc kubenswrapper[4956]: I0314 08:57:11.151933 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:11Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:11 crc kubenswrapper[4956]: I0314 08:57:11.662001 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 14 08:57:11 crc kubenswrapper[4956]: I0314 08:57:11.662306 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:57:11 crc kubenswrapper[4956]: I0314 08:57:11.664234 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:57:11 crc kubenswrapper[4956]: I0314 08:57:11.664301 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:57:11 crc kubenswrapper[4956]: I0314 08:57:11.664321 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:57:11 crc kubenswrapper[4956]: I0314 08:57:11.665677 4956 scope.go:117] "RemoveContainer" containerID="356eae237884acd7b18b80c8498ee1a7276bce84a459870157c26cd50a779851"
Mar 14 08:57:11 crc kubenswrapper[4956]: E0314 08:57:11.666117 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 14 08:57:12 crc kubenswrapper[4956]: I0314 08:57:12.125048 4956 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 08:57:12 crc kubenswrapper[4956]: I0314 08:57:12.125650 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 08:57:12 crc kubenswrapper[4956]: I0314 08:57:12.149347 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:12Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:13 crc kubenswrapper[4956]: I0314 08:57:13.151020 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:13Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:14 crc kubenswrapper[4956]: I0314 08:57:14.151099 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:14Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:14 crc kubenswrapper[4956]: I0314 08:57:14.462540 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 14 08:57:14 crc kubenswrapper[4956]: I0314 08:57:14.462751 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:57:14 crc kubenswrapper[4956]: I0314 08:57:14.464029 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:57:14 crc kubenswrapper[4956]: I0314 08:57:14.464070 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:57:14 crc kubenswrapper[4956]: I0314 08:57:14.464078 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:57:14 crc kubenswrapper[4956]: I0314 08:57:14.464651 4956 scope.go:117] "RemoveContainer" containerID="356eae237884acd7b18b80c8498ee1a7276bce84a459870157c26cd50a779851"
Mar 14 08:57:14 crc kubenswrapper[4956]: E0314 08:57:14.464805 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 14 08:57:14 crc kubenswrapper[4956]: W0314 08:57:14.470328 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:14Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:14 crc kubenswrapper[4956]: E0314 08:57:14.470381 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:14Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 14 08:57:15 crc kubenswrapper[4956]: I0314 08:57:15.153168 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:15Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:15 crc kubenswrapper[4956]: E0314 08:57:15.304372 4956 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 14 08:57:16 crc kubenswrapper[4956]: I0314 08:57:16.149076 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:16Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:16 crc kubenswrapper[4956]: I0314 08:57:16.395282 4956 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 14 08:57:16 crc kubenswrapper[4956]: E0314 08:57:16.402155 4956 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 14 08:57:16 crc kubenswrapper[4956]: E0314 08:57:16.404173 4956 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError"
Mar 14 08:57:17 crc kubenswrapper[4956]: I0314 08:57:17.150571 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:17Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:17 crc kubenswrapper[4956]: W0314 08:57:17.808357 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:17Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:17 crc kubenswrapper[4956]: E0314 08:57:17.808475 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 14 08:57:17 crc kubenswrapper[4956]: E0314 08:57:17.872151 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:17Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 14 08:57:17 crc kubenswrapper[4956]: I0314 08:57:17.877436 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:57:17 crc kubenswrapper[4956]: I0314 08:57:17.879636 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:57:17 crc kubenswrapper[4956]: I0314 08:57:17.879698 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:57:17 crc kubenswrapper[4956]: I0314 08:57:17.879719 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:57:17 crc kubenswrapper[4956]: I0314 08:57:17.879762 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 14 08:57:17 crc kubenswrapper[4956]: E0314 08:57:17.882995 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:17Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 14 08:57:18 crc kubenswrapper[4956]: I0314 08:57:18.151089 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:18Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:18 crc kubenswrapper[4956]: W0314 08:57:18.554315 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:18Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:18 crc kubenswrapper[4956]: E0314 08:57:18.554517 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 14 08:57:19 crc kubenswrapper[4956]: I0314 08:57:19.151665 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:19Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:20 crc kubenswrapper[4956]: I0314 08:57:20.150017 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:20Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:20 crc kubenswrapper[4956]: E0314 08:57:20.473542 4956 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:20Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ca962d04735fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.143349757 +0000 UTC m=+0.656042035,LastTimestamp:2026-03-14 08:56:35.143349757 +0000 UTC m=+0.656042035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 08:57:21 crc kubenswrapper[4956]: I0314 08:57:21.151948 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:21Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:22 crc kubenswrapper[4956]: I0314 08:57:22.125791 4956 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 08:57:22 crc kubenswrapper[4956]: I0314 08:57:22.125871 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 08:57:22 crc kubenswrapper[4956]: I0314 08:57:22.152159 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:22Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:23 crc kubenswrapper[4956]: I0314 08:57:23.150415 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:23Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:23 crc kubenswrapper[4956]: W0314 08:57:23.152663 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:23Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:23 crc kubenswrapper[4956]: E0314 08:57:23.152760 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 14 08:57:24 crc kubenswrapper[4956]: I0314 08:57:24.151175 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:24Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:24 crc kubenswrapper[4956]: E0314 08:57:24.877521 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:24Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 14 08:57:24 crc kubenswrapper[4956]: I0314 08:57:24.883731 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:57:24 crc kubenswrapper[4956]: I0314 08:57:24.885257 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:57:24 crc kubenswrapper[4956]: I0314 08:57:24.885322 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:57:24 crc kubenswrapper[4956]: I0314 08:57:24.885347 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:57:24 crc kubenswrapper[4956]: I0314 08:57:24.885388 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 14 08:57:24 crc kubenswrapper[4956]: E0314 08:57:24.888224 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:24Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 14 08:57:25 crc kubenswrapper[4956]: I0314 08:57:25.150868 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:25Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:25 crc kubenswrapper[4956]: E0314 08:57:25.304976 4956 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 14 08:57:25 crc kubenswrapper[4956]: I0314 08:57:25.416447 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 14 08:57:25 crc kubenswrapper[4956]: I0314 08:57:25.416775 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:57:25 crc kubenswrapper[4956]: I0314 08:57:25.418296 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:57:25 crc kubenswrapper[4956]: I0314 08:57:25.418386 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:57:25 crc kubenswrapper[4956]: I0314 08:57:25.418410 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:57:26 crc kubenswrapper[4956]: I0314 08:57:26.151390 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:26Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:26 crc kubenswrapper[4956]: I0314 08:57:26.209423 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:57:26 crc kubenswrapper[4956]: I0314 08:57:26.211274 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:57:26 crc kubenswrapper[4956]: I0314 08:57:26.211334 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:57:26 crc kubenswrapper[4956]: I0314 08:57:26.211355 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:57:26 crc kubenswrapper[4956]: I0314 08:57:26.212259 4956 scope.go:117] "RemoveContainer" containerID="356eae237884acd7b18b80c8498ee1a7276bce84a459870157c26cd50a779851"
Mar 14 08:57:26 crc kubenswrapper[4956]: E0314 08:57:26.212579 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 14 08:57:27 crc kubenswrapper[4956]: I0314 08:57:27.151654 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:27Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:28 crc kubenswrapper[4956]: I0314 08:57:28.152810 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:28Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:29 crc kubenswrapper[4956]: I0314 08:57:29.150673 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:29Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:30 crc kubenswrapper[4956]: I0314 08:57:30.148368 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:30Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:30 crc kubenswrapper[4956]: E0314 08:57:30.479345 4956 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:30Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ca962d04735fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.143349757 +0000 UTC m=+0.656042035,LastTimestamp:2026-03-14 08:56:35.143349757 +0000 UTC m=+0.656042035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 08:57:31 crc kubenswrapper[4956]: I0314 08:57:31.149842 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:31Z is after 2026-02-23T05:33:13Z
Mar 14 08:57:31 crc kubenswrapper[4956]: E0314 08:57:31.883207 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate:
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:31Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 14 08:57:31 crc kubenswrapper[4956]: I0314 08:57:31.888641 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:31 crc kubenswrapper[4956]: I0314 08:57:31.890419 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:31 crc kubenswrapper[4956]: I0314 08:57:31.890471 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:31 crc kubenswrapper[4956]: I0314 08:57:31.890519 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:31 crc kubenswrapper[4956]: I0314 08:57:31.890562 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:57:31 crc kubenswrapper[4956]: E0314 08:57:31.895811 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:31Z is after 2026-02-23T05:33:13Z" node="crc" Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.125355 4956 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.125601 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" 
probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.125691 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.125882 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.127695 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.127746 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.127765 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.128467 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"7fac7c1aebf7e2e76475ec534d7f6d29771289f555a065925548213ae265250a"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.128661 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://7fac7c1aebf7e2e76475ec534d7f6d29771289f555a065925548213ae265250a" gracePeriod=30 Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.149826 4956 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:32Z is after 2026-02-23T05:33:13Z Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.494010 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.495181 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.495816 4956 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="7fac7c1aebf7e2e76475ec534d7f6d29771289f555a065925548213ae265250a" exitCode=255 Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.495860 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"7fac7c1aebf7e2e76475ec534d7f6d29771289f555a065925548213ae265250a"} Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.495975 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"eb006cf48ec5280d10d2329b0f34877d6f22d340a70aefcf3b1098eca46b8ed3"} Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.496082 4956 scope.go:117] "RemoveContainer" containerID="d07fcb9bc726f1834ca31b99f2906ceb517673b36e1b1527e2e157dc8540cf81" Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.496103 4956 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.497429 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.497460 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:32 crc kubenswrapper[4956]: I0314 08:57:32.497470 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:33 crc kubenswrapper[4956]: I0314 08:57:33.149172 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:33Z is after 2026-02-23T05:33:13Z Mar 14 08:57:33 crc kubenswrapper[4956]: I0314 08:57:33.462798 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:57:33 crc kubenswrapper[4956]: I0314 08:57:33.502462 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 14 08:57:33 crc kubenswrapper[4956]: I0314 08:57:33.504008 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:33 crc kubenswrapper[4956]: I0314 08:57:33.504958 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:33 crc kubenswrapper[4956]: I0314 08:57:33.504990 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:33 crc 
kubenswrapper[4956]: I0314 08:57:33.505004 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:34 crc kubenswrapper[4956]: I0314 08:57:34.151209 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:34Z is after 2026-02-23T05:33:13Z Mar 14 08:57:34 crc kubenswrapper[4956]: I0314 08:57:34.507589 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:34 crc kubenswrapper[4956]: I0314 08:57:34.509101 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:34 crc kubenswrapper[4956]: I0314 08:57:34.509202 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:34 crc kubenswrapper[4956]: I0314 08:57:34.509291 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:35 crc kubenswrapper[4956]: I0314 08:57:35.149627 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:57:35Z is after 2026-02-23T05:33:13Z Mar 14 08:57:35 crc kubenswrapper[4956]: E0314 08:57:35.305686 4956 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 08:57:36 crc kubenswrapper[4956]: I0314 08:57:36.156766 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io 
"crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:57:37 crc kubenswrapper[4956]: I0314 08:57:37.152194 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:57:38 crc kubenswrapper[4956]: I0314 08:57:38.154542 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:57:38 crc kubenswrapper[4956]: I0314 08:57:38.208966 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:38 crc kubenswrapper[4956]: I0314 08:57:38.211090 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:38 crc kubenswrapper[4956]: I0314 08:57:38.211155 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:38 crc kubenswrapper[4956]: I0314 08:57:38.211182 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:38 crc kubenswrapper[4956]: I0314 08:57:38.212559 4956 scope.go:117] "RemoveContainer" containerID="356eae237884acd7b18b80c8498ee1a7276bce84a459870157c26cd50a779851" Mar 14 08:57:38 crc kubenswrapper[4956]: E0314 08:57:38.892903 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 14 08:57:38 crc kubenswrapper[4956]: I0314 08:57:38.896930 4956 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:38 crc kubenswrapper[4956]: I0314 08:57:38.899530 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:38 crc kubenswrapper[4956]: I0314 08:57:38.899585 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:38 crc kubenswrapper[4956]: I0314 08:57:38.899598 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:38 crc kubenswrapper[4956]: I0314 08:57:38.899639 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:57:38 crc kubenswrapper[4956]: E0314 08:57:38.907922 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 14 08:57:39 crc kubenswrapper[4956]: I0314 08:57:39.124764 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:57:39 crc kubenswrapper[4956]: I0314 08:57:39.125005 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:39 crc kubenswrapper[4956]: I0314 08:57:39.126436 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:39 crc kubenswrapper[4956]: I0314 08:57:39.126493 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:39 crc kubenswrapper[4956]: I0314 08:57:39.126505 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:39 crc kubenswrapper[4956]: I0314 08:57:39.153992 4956 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:57:39 crc kubenswrapper[4956]: I0314 08:57:39.839979 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 14 08:57:39 crc kubenswrapper[4956]: I0314 08:57:39.842358 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa"} Mar 14 08:57:39 crc kubenswrapper[4956]: I0314 08:57:39.842617 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:39 crc kubenswrapper[4956]: I0314 08:57:39.843752 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:39 crc kubenswrapper[4956]: I0314 08:57:39.843797 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:39 crc kubenswrapper[4956]: I0314 08:57:39.843810 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:40 crc kubenswrapper[4956]: I0314 08:57:40.152160 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.490286 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d04735fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.143349757 +0000 UTC m=+0.656042035,LastTimestamp:2026-03-14 08:56:35.143349757 +0000 UTC m=+0.656042035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.499806 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d3893d5a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198008666 +0000 UTC m=+0.710700944,LastTimestamp:2026-03-14 08:56:35.198008666 +0000 UTC m=+0.710700944,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.512706 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d389a27c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198034556 +0000 UTC m=+0.710726834,LastTimestamp:2026-03-14 08:56:35.198034556 +0000 UTC m=+0.710726834,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.518599 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d389dbc1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198049217 +0000 UTC m=+0.710741505,LastTimestamp:2026-03-14 08:56:35.198049217 +0000 UTC m=+0.710741505,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.523838 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d983d02f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across 
pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.298316335 +0000 UTC m=+0.811008623,LastTimestamp:2026-03-14 08:56:35.298316335 +0000 UTC m=+0.811008623,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.530775 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d3893d5a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d3893d5a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198008666 +0000 UTC m=+0.710700944,LastTimestamp:2026-03-14 08:56:35.310386998 +0000 UTC m=+0.823079286,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.537714 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d389a27c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d389a27c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198034556 +0000 UTC m=+0.710726834,LastTimestamp:2026-03-14 
08:56:35.310422429 +0000 UTC m=+0.823114707,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.542680 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d389dbc1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d389dbc1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198049217 +0000 UTC m=+0.710741505,LastTimestamp:2026-03-14 08:56:35.31044135 +0000 UTC m=+0.823133628,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.548050 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d3893d5a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d3893d5a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198008666 +0000 UTC m=+0.710700944,LastTimestamp:2026-03-14 08:56:35.311791053 +0000 UTC m=+0.824483321,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.553865 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d389a27c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d389a27c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198034556 +0000 UTC m=+0.710726834,LastTimestamp:2026-03-14 08:56:35.311811393 +0000 UTC m=+0.824503661,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.560692 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d389dbc1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d389dbc1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198049217 +0000 UTC m=+0.710741505,LastTimestamp:2026-03-14 08:56:35.311821713 +0000 UTC m=+0.824513981,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.567350 4956 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189ca962d3893d5a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d3893d5a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198008666 +0000 UTC m=+0.710700944,LastTimestamp:2026-03-14 08:56:35.31290811 +0000 UTC m=+0.825600388,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.569372 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d389a27c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d389a27c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198034556 +0000 UTC m=+0.710726834,LastTimestamp:2026-03-14 08:56:35.312942281 +0000 UTC m=+0.825634559,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.575597 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d389dbc1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group 
\"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d389dbc1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198049217 +0000 UTC m=+0.710741505,LastTimestamp:2026-03-14 08:56:35.312961151 +0000 UTC m=+0.825653429,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.583642 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d3893d5a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d3893d5a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198008666 +0000 UTC m=+0.710700944,LastTimestamp:2026-03-14 08:56:35.318399173 +0000 UTC m=+0.831091441,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.588726 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d389a27c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d389a27c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198034556 +0000 UTC m=+0.710726834,LastTimestamp:2026-03-14 08:56:35.318415534 +0000 UTC m=+0.831107802,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.596036 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d389dbc1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d389dbc1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198049217 +0000 UTC m=+0.710741505,LastTimestamp:2026-03-14 08:56:35.318425294 +0000 UTC m=+0.831117562,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.602388 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d3893d5a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d3893d5a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198008666 +0000 UTC m=+0.710700944,LastTimestamp:2026-03-14 08:56:35.318475915 +0000 UTC m=+0.831168193,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.609040 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d389a27c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d389a27c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198034556 +0000 UTC m=+0.710726834,LastTimestamp:2026-03-14 08:56:35.318520486 +0000 UTC m=+0.831212764,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.616419 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d389dbc1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d389dbc1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198049217 +0000 UTC 
m=+0.710741505,LastTimestamp:2026-03-14 08:56:35.318532497 +0000 UTC m=+0.831224775,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.622769 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d3893d5a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d3893d5a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198008666 +0000 UTC m=+0.710700944,LastTimestamp:2026-03-14 08:56:35.319799967 +0000 UTC m=+0.832492275,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.630061 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d389a27c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d389a27c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198034556 +0000 UTC m=+0.710726834,LastTimestamp:2026-03-14 08:56:35.319834078 +0000 UTC m=+0.832526386,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.637567 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d389dbc1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d389dbc1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198049217 +0000 UTC m=+0.710741505,LastTimestamp:2026-03-14 08:56:35.319857339 +0000 UTC m=+0.832549647,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.644758 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d3893d5a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d3893d5a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198008666 +0000 UTC m=+0.710700944,LastTimestamp:2026-03-14 08:56:35.320802912 +0000 UTC m=+0.833495180,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.649998 4956 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca962d389a27c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca962d389a27c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.198034556 +0000 UTC m=+0.710726834,LastTimestamp:2026-03-14 08:56:35.320821402 +0000 UTC m=+0.833513670,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.659478 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca962f3142713 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.727206163 +0000 UTC m=+1.239898471,LastTimestamp:2026-03-14 08:56:35.727206163 +0000 UTC m=+1.239898471,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 
08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.667612 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca962f3a7f827 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.736893479 +0000 UTC m=+1.249585747,LastTimestamp:2026-03-14 08:56:35.736893479 +0000 UTC m=+1.249585747,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.672578 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca962f3c166e5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.738560229 +0000 UTC m=+1.251252497,LastTimestamp:2026-03-14 08:56:35.738560229 +0000 UTC m=+1.251252497,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.674655 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca962f4d06dd5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.756322261 +0000 UTC m=+1.269014559,LastTimestamp:2026-03-14 08:56:35.756322261 +0000 UTC m=+1.269014559,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.683061 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca962f5a9e580 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:35.770574208 +0000 UTC m=+1.283266516,LastTimestamp:2026-03-14 08:56:35.770574208 +0000 UTC m=+1.283266516,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.688724 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca9631a692e6e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.38709003 +0000 UTC m=+1.899782318,LastTimestamp:2026-03-14 08:56:36.38709003 +0000 UTC m=+1.899782318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.693456 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca9631a69a813 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.387121171 +0000 UTC m=+1.899813439,LastTimestamp:2026-03-14 08:56:36.387121171 +0000 UTC m=+1.899813439,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.700234 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca9631adb221e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.394557982 +0000 UTC m=+1.907250250,LastTimestamp:2026-03-14 08:56:36.394557982 +0000 UTC m=+1.907250250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.707386 4956 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca9631aeb4441 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.395615297 +0000 UTC m=+1.908307565,LastTimestamp:2026-03-14 08:56:36.395615297 +0000 UTC m=+1.908307565,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.712142 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca9631b5506aa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.402546346 +0000 UTC m=+1.915238614,LastTimestamp:2026-03-14 08:56:36.402546346 +0000 UTC m=+1.915238614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: 
E0314 08:57:40.719045 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca9631b58017f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.402741631 +0000 UTC m=+1.915433939,LastTimestamp:2026-03-14 08:56:36.402741631 +0000 UTC m=+1.915433939,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.726835 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca9631b647d65 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.403559781 +0000 UTC m=+1.916252099,LastTimestamp:2026-03-14 08:56:36.403559781 +0000 UTC m=+1.916252099,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.732102 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca9631b71aa40 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.404423232 +0000 UTC m=+1.917115500,LastTimestamp:2026-03-14 08:56:36.404423232 +0000 UTC m=+1.917115500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.739146 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca9631c5a652b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.419675435 +0000 UTC m=+1.932367703,LastTimestamp:2026-03-14 08:56:36.419675435 +0000 UTC m=+1.932367703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.746927 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca9631c889c29 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.422704169 +0000 UTC m=+1.935396467,LastTimestamp:2026-03-14 08:56:36.422704169 +0000 UTC m=+1.935396467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.751310 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca9631c88d438 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.42271852 +0000 UTC m=+1.935410828,LastTimestamp:2026-03-14 08:56:36.42271852 +0000 UTC m=+1.935410828,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.758004 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca96330ac61a3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.760592803 +0000 UTC m=+2.273285091,LastTimestamp:2026-03-14 08:56:36.760592803 +0000 UTC m=+2.273285091,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.763068 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca963315c43d3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.772119507 +0000 UTC m=+2.284811775,LastTimestamp:2026-03-14 08:56:36.772119507 +0000 UTC m=+2.284811775,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.769126 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca96331789345 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.773974853 +0000 UTC m=+2.286667121,LastTimestamp:2026-03-14 08:56:36.773974853 +0000 UTC m=+2.286667121,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.776059 4956 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca9633e4ccc31 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.989209649 +0000 UTC m=+2.501901957,LastTimestamp:2026-03-14 08:56:36.989209649 +0000 UTC m=+2.501901957,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.781661 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca96341915ad2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.044034258 +0000 UTC m=+2.556726556,LastTimestamp:2026-03-14 08:56:37.044034258 +0000 UTC m=+2.556726556,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.788327 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca96341b1f2da openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.04617033 +0000 UTC m=+2.558862598,LastTimestamp:2026-03-14 08:56:37.04617033 +0000 UTC m=+2.558862598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.795919 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca9634cd45795 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.232973717 +0000 UTC m=+2.745666025,LastTimestamp:2026-03-14 08:56:37.232973717 +0000 UTC m=+2.745666025,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.803422 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca9634d23e97f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.238188415 +0000 UTC m=+2.750880723,LastTimestamp:2026-03-14 08:56:37.238188415 +0000 UTC m=+2.750880723,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.809141 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca9634d7d9916 openshift-machine-config-operator 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.24406607 +0000 UTC m=+2.756758338,LastTimestamp:2026-03-14 08:56:37.24406607 +0000 UTC m=+2.756758338,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.814784 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca9634de0aa69 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.250558569 +0000 UTC m=+2.763250877,LastTimestamp:2026-03-14 08:56:37.250558569 +0000 UTC m=+2.763250877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 
08:57:40.823431 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca96351e4e74f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.317945167 +0000 UTC m=+2.830637445,LastTimestamp:2026-03-14 08:56:37.317945167 +0000 UTC m=+2.830637445,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.830513 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca9635332b680 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.339821696 +0000 UTC m=+2.852513964,LastTimestamp:2026-03-14 
08:56:37.339821696 +0000 UTC m=+2.852513964,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.837361 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca9635b96525d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.480567389 +0000 UTC m=+2.993259657,LastTimestamp:2026-03-14 08:56:37.480567389 +0000 UTC m=+2.993259657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.845104 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca9635ba55c1b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.481552923 +0000 UTC 
m=+2.994245191,LastTimestamp:2026-03-14 08:56:37.481552923 +0000 UTC m=+2.994245191,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: I0314 08:57:40.847745 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 08:57:40 crc kubenswrapper[4956]: I0314 08:57:40.848324 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 14 08:57:40 crc kubenswrapper[4956]: I0314 08:57:40.850561 4956 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa" exitCode=255 Mar 14 08:57:40 crc kubenswrapper[4956]: I0314 08:57:40.850617 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa"} Mar 14 08:57:40 crc kubenswrapper[4956]: I0314 08:57:40.850676 4956 scope.go:117] "RemoveContainer" containerID="356eae237884acd7b18b80c8498ee1a7276bce84a459870157c26cd50a779851" Mar 14 08:57:40 crc kubenswrapper[4956]: I0314 08:57:40.850975 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.852460 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca9635bab193e openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.481929022 +0000 UTC m=+2.994621290,LastTimestamp:2026-03-14 08:56:37.481929022 +0000 UTC m=+2.994621290,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: I0314 08:57:40.853009 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:40 crc kubenswrapper[4956]: I0314 08:57:40.853036 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:40 crc kubenswrapper[4956]: I0314 08:57:40.853075 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:40 crc kubenswrapper[4956]: I0314 08:57:40.854049 4956 scope.go:117] "RemoveContainer" containerID="f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.854371 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.859795 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca9635bb0ce24 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.482303012 +0000 UTC m=+2.994995280,LastTimestamp:2026-03-14 08:56:37.482303012 +0000 UTC m=+2.994995280,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.865131 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca9635cb18343 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.499126595 +0000 UTC m=+3.011818863,LastTimestamp:2026-03-14 08:56:37.499126595 +0000 UTC m=+3.011818863,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.870755 4956 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca9635cc9af8a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.500710794 +0000 UTC m=+3.013403062,LastTimestamp:2026-03-14 08:56:37.500710794 +0000 UTC m=+3.013403062,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.876340 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca9635d8628bc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.513062588 +0000 UTC m=+3.025754856,LastTimestamp:2026-03-14 08:56:37.513062588 +0000 UTC m=+3.025754856,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.883083 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca9635d885adc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.513206492 +0000 UTC m=+3.025898760,LastTimestamp:2026-03-14 08:56:37.513206492 +0000 UTC m=+3.025898760,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.888096 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca9635d88f443 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.513245763 +0000 UTC m=+3.025938031,LastTimestamp:2026-03-14 08:56:37.513245763 +0000 UTC 
m=+3.025938031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.892650 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca9635d955477 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.514056823 +0000 UTC m=+3.026749091,LastTimestamp:2026-03-14 08:56:37.514056823 +0000 UTC m=+3.026749091,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.897046 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca9636b6b2fb4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.746175924 +0000 UTC m=+3.258868212,LastTimestamp:2026-03-14 08:56:37.746175924 +0000 UTC m=+3.258868212,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.902627 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca9636ba6e533 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.750089011 +0000 UTC m=+3.262781269,LastTimestamp:2026-03-14 08:56:37.750089011 +0000 UTC m=+3.262781269,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.908763 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca9636c0018a4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.755934884 +0000 UTC m=+3.268627152,LastTimestamp:2026-03-14 08:56:37.755934884 +0000 UTC m=+3.268627152,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.914161 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca9636c13fa6b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.757237867 +0000 UTC m=+3.269930135,LastTimestamp:2026-03-14 08:56:37.757237867 +0000 UTC m=+3.269930135,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc 
kubenswrapper[4956]: E0314 08:57:40.919787 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca9636c32e452 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.759263826 +0000 UTC m=+3.271956094,LastTimestamp:2026-03-14 08:56:37.759263826 +0000 UTC m=+3.271956094,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.927430 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca9636c55e9b3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.761558963 +0000 UTC 
m=+3.274251231,LastTimestamp:2026-03-14 08:56:37.761558963 +0000 UTC m=+3.274251231,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.932921 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca963792f52d1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.977133777 +0000 UTC m=+3.489826045,LastTimestamp:2026-03-14 08:56:37.977133777 +0000 UTC m=+3.489826045,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.938189 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca9637943b13e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created 
container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:37.97846867 +0000 UTC m=+3.491160938,LastTimestamp:2026-03-14 08:56:37.97846867 +0000 UTC m=+3.491160938,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.943340 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca9637b98a915 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:38.017591573 +0000 UTC m=+3.530283841,LastTimestamp:2026-03-14 08:56:38.017591573 +0000 UTC m=+3.530283841,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.951309 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca9637bb14bea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:38.019206122 +0000 UTC m=+3.531898410,LastTimestamp:2026-03-14 08:56:38.019206122 +0000 UTC m=+3.531898410,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.960001 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca9637bc00daa openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:38.020173226 +0000 UTC m=+3.532865494,LastTimestamp:2026-03-14 08:56:38.020173226 +0000 UTC m=+3.532865494,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.965724 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca96385b0a8a9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:38.186936489 +0000 UTC m=+3.699628757,LastTimestamp:2026-03-14 08:56:38.186936489 +0000 UTC m=+3.699628757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.970851 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca96386578da9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:38.197874089 +0000 UTC m=+3.710566347,LastTimestamp:2026-03-14 08:56:38.197874089 +0000 UTC m=+3.710566347,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.976987 
4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca96386694d85 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:38.199037317 +0000 UTC m=+3.711729585,LastTimestamp:2026-03-14 08:56:38.199037317 +0000 UTC m=+3.711729585,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.982748 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca9638adc2128 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:38.273671464 +0000 UTC 
m=+3.786363732,LastTimestamp:2026-03-14 08:56:38.273671464 +0000 UTC m=+3.786363732,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.987503 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca963aab1fa74 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:38.807779956 +0000 UTC m=+4.320472274,LastTimestamp:2026-03-14 08:56:38.807779956 +0000 UTC m=+4.320472274,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.992287 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca963ab017c0a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 
08:56:38.812990474 +0000 UTC m=+4.325682742,LastTimestamp:2026-03-14 08:56:38.812990474 +0000 UTC m=+4.325682742,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:40 crc kubenswrapper[4956]: E0314 08:57:40.998435 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca963abb804ce openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:38.824953038 +0000 UTC m=+4.337645306,LastTimestamp:2026-03-14 08:56:38.824953038 +0000 UTC m=+4.337645306,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.004242 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca963abe3af99 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container 
etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:38.827814809 +0000 UTC m=+4.340507077,LastTimestamp:2026-03-14 08:56:38.827814809 +0000 UTC m=+4.340507077,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.008975 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca963c6cea9e0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:39.27942192 +0000 UTC m=+4.792114228,LastTimestamp:2026-03-14 08:56:39.27942192 +0000 UTC m=+4.792114228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.015348 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca963d82a9146 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:39.570657606 +0000 UTC m=+5.083349874,LastTimestamp:2026-03-14 08:56:39.570657606 +0000 UTC m=+5.083349874,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.019433 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca963dad60e7c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:39.615450748 +0000 UTC m=+5.128143016,LastTimestamp:2026-03-14 08:56:39.615450748 +0000 UTC m=+5.128143016,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.027677 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca963daef5865 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:39.617108069 +0000 UTC m=+5.129800377,LastTimestamp:2026-03-14 08:56:39.617108069 +0000 UTC m=+5.129800377,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.034997 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca963f1a69304 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:39.998214916 +0000 UTC m=+5.510907184,LastTimestamp:2026-03-14 08:56:39.998214916 +0000 UTC m=+5.510907184,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.039683 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca963f489c7b4 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:40.046659508 +0000 UTC m=+5.559351776,LastTimestamp:2026-03-14 08:56:40.046659508 +0000 UTC m=+5.559351776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.045792 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca963f4a6c983 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:40.048560515 +0000 UTC m=+5.561252823,LastTimestamp:2026-03-14 08:56:40.048560515 +0000 UTC m=+5.561252823,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.050592 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ca964001c8889 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:40.240826505 +0000 UTC m=+5.753518783,LastTimestamp:2026-03-14 08:56:40.240826505 +0000 UTC m=+5.753518783,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.055190 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca96402ae85bc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:40.283948476 +0000 UTC m=+5.796640744,LastTimestamp:2026-03-14 08:56:40.283948476 +0000 UTC m=+5.796640744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.060415 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca96402c2dc8b openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:40.285281419 +0000 UTC m=+5.797973687,LastTimestamp:2026-03-14 08:56:40.285281419 +0000 UTC m=+5.797973687,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.066135 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ca96386694d85\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca96386694d85 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:38.199037317 +0000 UTC m=+3.711729585,LastTimestamp:2026-03-14 08:56:40.302599415 +0000 UTC m=+5.815291673,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc 
kubenswrapper[4956]: E0314 08:57:41.072799 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca9640f1b4646 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:40.492402246 +0000 UTC m=+6.005094504,LastTimestamp:2026-03-14 08:56:40.492402246 +0000 UTC m=+6.005094504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.078882 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ca963aab1fa74\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca963aab1fa74 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:38.807779956 +0000 UTC m=+4.320472274,LastTimestamp:2026-03-14 08:56:40.493595875 +0000 UTC m=+6.006288173,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.085385 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ca963abb804ce\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca963abb804ce openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:38.824953038 +0000 UTC m=+4.337645306,LastTimestamp:2026-03-14 08:56:40.505865237 +0000 UTC m=+6.018557495,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.091759 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca9640ffdaae5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:40.507239141 +0000 UTC m=+6.019931409,LastTimestamp:2026-03-14 08:56:40.507239141 +0000 UTC 
m=+6.019931409,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.095185 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca964101050e1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:40.508461281 +0000 UTC m=+6.021153549,LastTimestamp:2026-03-14 08:56:40.508461281 +0000 UTC m=+6.021153549,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.100270 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca9641be0b2d2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:40.706667218 +0000 UTC 
m=+6.219359486,LastTimestamp:2026-03-14 08:56:40.706667218 +0000 UTC m=+6.219359486,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.106223 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca9641c7f8ea8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:40.717078184 +0000 UTC m=+6.229770452,LastTimestamp:2026-03-14 08:56:40.717078184 +0000 UTC m=+6.229770452,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.112068 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 14 08:57:41 crc kubenswrapper[4956]: &Event{ObjectMeta:{kube-controller-manager-crc.189ca9647073955b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 14 08:57:41 crc kubenswrapper[4956]: body: Mar 14 08:57:41 crc kubenswrapper[4956]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:42.125579611 +0000 UTC m=+7.638271919,LastTimestamp:2026-03-14 08:56:42.125579611 +0000 UTC m=+7.638271919,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 08:57:41 crc kubenswrapper[4956]: > Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.116073 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca964707879c5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:42.125900229 +0000 UTC m=+7.638592577,LastTimestamp:2026-03-14 08:56:42.125900229 +0000 UTC m=+7.638592577,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.122981 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event=< Mar 14 08:57:41 crc kubenswrapper[4956]: &Event{ObjectMeta:{kube-apiserver-crc.189ca9666008b4d3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 14 08:57:41 crc kubenswrapper[4956]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 14 08:57:41 crc kubenswrapper[4956]: Mar 14 08:57:41 crc kubenswrapper[4956]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:50.440074451 +0000 UTC m=+15.952766729,LastTimestamp:2026-03-14 08:56:50.440074451 +0000 UTC m=+15.952766729,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 08:57:41 crc kubenswrapper[4956]: > Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.128339 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca9666009877e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:50.440128382 +0000 UTC 
m=+15.952820670,LastTimestamp:2026-03-14 08:56:50.440128382 +0000 UTC m=+15.952820670,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.134634 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ca9666008b4d3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 14 08:57:41 crc kubenswrapper[4956]: &Event{ObjectMeta:{kube-apiserver-crc.189ca9666008b4d3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 14 08:57:41 crc kubenswrapper[4956]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 14 08:57:41 crc kubenswrapper[4956]: Mar 14 08:57:41 crc kubenswrapper[4956]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:50.440074451 +0000 UTC m=+15.952766729,LastTimestamp:2026-03-14 08:56:50.446621952 +0000 UTC m=+15.959314220,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 08:57:41 crc kubenswrapper[4956]: > Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.139614 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ca9666009877e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca9666009877e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:50.440128382 +0000 UTC m=+15.952820670,LastTimestamp:2026-03-14 08:56:50.446658223 +0000 UTC m=+15.959350481,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.145588 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 14 08:57:41 crc kubenswrapper[4956]: &Event{ObjectMeta:{kube-controller-manager-crc.189ca966c487773c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 14 08:57:41 crc kubenswrapper[4956]: body: Mar 14 08:57:41 crc kubenswrapper[4956]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:52.126103356 +0000 UTC m=+17.638795654,LastTimestamp:2026-03-14 08:56:52.126103356 +0000 UTC 
m=+17.638795654,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 08:57:41 crc kubenswrapper[4956]: > Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.148749 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca966c488b9b4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:52.126185908 +0000 UTC m=+17.638878206,LastTimestamp:2026-03-14 08:56:52.126185908 +0000 UTC m=+17.638878206,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: I0314 08:57:41.149595 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.151518 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca9647073955b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-controller-manager\"" event=< Mar 14 08:57:41 crc kubenswrapper[4956]: &Event{ObjectMeta:{kube-controller-manager-crc.189ca9647073955b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 14 08:57:41 crc kubenswrapper[4956]: body: Mar 14 08:57:41 crc kubenswrapper[4956]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:42.125579611 +0000 UTC m=+7.638271919,LastTimestamp:2026-03-14 08:57:02.125837191 +0000 UTC m=+27.638529479,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 08:57:41 crc kubenswrapper[4956]: > Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.155839 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca964707879c5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca964707879c5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded 
while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:42.125900229 +0000 UTC m=+7.638592577,LastTimestamp:2026-03-14 08:57:02.125898503 +0000 UTC m=+27.638590781,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.161587 4956 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca96918bfecd2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:57:02.12902421 +0000 UTC m=+27.641716488,LastTimestamp:2026-03-14 08:57:02.12902421 +0000 UTC m=+27.641716488,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.166731 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca9631b71aa40\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca9631b71aa40 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.404423232 +0000 UTC m=+1.917115500,LastTimestamp:2026-03-14 08:57:02.254778964 +0000 UTC m=+27.767471272,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.171023 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca96330ac61a3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca96330ac61a3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.760592803 +0000 UTC m=+2.273285091,LastTimestamp:2026-03-14 08:57:02.506564949 +0000 UTC m=+28.019257257,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.175799 4956 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189ca963315c43d3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca963315c43d3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:36.772119507 +0000 UTC m=+2.284811775,LastTimestamp:2026-03-14 08:57:02.519987299 +0000 UTC m=+28.032679577,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.183744 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca966c487773c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 14 08:57:41 crc kubenswrapper[4956]: &Event{ObjectMeta:{kube-controller-manager-crc.189ca966c487773c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 
14 08:57:41 crc kubenswrapper[4956]: body: Mar 14 08:57:41 crc kubenswrapper[4956]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:52.126103356 +0000 UTC m=+17.638795654,LastTimestamp:2026-03-14 08:57:12.125609871 +0000 UTC m=+37.638302159,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 08:57:41 crc kubenswrapper[4956]: > Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.188032 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca966c488b9b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca966c488b9b4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:52.126185908 +0000 UTC m=+17.638878206,LastTimestamp:2026-03-14 08:57:12.125715653 +0000 UTC m=+37.638407921,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.193437 4956 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca966c487773c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-controller-manager\"" event=< Mar 14 08:57:41 crc kubenswrapper[4956]: &Event{ObjectMeta:{kube-controller-manager-crc.189ca966c487773c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 14 08:57:41 crc kubenswrapper[4956]: body: Mar 14 08:57:41 crc kubenswrapper[4956]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:56:52.126103356 +0000 UTC m=+17.638795654,LastTimestamp:2026-03-14 08:57:22.125854199 +0000 UTC m=+47.638546477,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 08:57:41 crc kubenswrapper[4956]: > Mar 14 08:57:41 crc kubenswrapper[4956]: I0314 08:57:41.663014 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:57:41 crc kubenswrapper[4956]: I0314 08:57:41.855929 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 08:57:41 crc kubenswrapper[4956]: I0314 08:57:41.859273 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:41 crc kubenswrapper[4956]: I0314 08:57:41.860471 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:41 crc kubenswrapper[4956]: I0314 08:57:41.860536 4956 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:41 crc kubenswrapper[4956]: I0314 08:57:41.860554 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:41 crc kubenswrapper[4956]: I0314 08:57:41.861278 4956 scope.go:117] "RemoveContainer" containerID="f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa" Mar 14 08:57:41 crc kubenswrapper[4956]: E0314 08:57:41.861511 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:57:42 crc kubenswrapper[4956]: I0314 08:57:42.125382 4956 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 08:57:42 crc kubenswrapper[4956]: I0314 08:57:42.125566 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 08:57:42 crc kubenswrapper[4956]: I0314 08:57:42.152528 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:57:43 crc kubenswrapper[4956]: I0314 08:57:43.153436 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:57:44 crc kubenswrapper[4956]: I0314 08:57:44.154749 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:57:44 crc kubenswrapper[4956]: I0314 08:57:44.462302 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:57:44 crc kubenswrapper[4956]: I0314 08:57:44.462616 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:44 crc kubenswrapper[4956]: I0314 08:57:44.464599 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:44 crc kubenswrapper[4956]: I0314 08:57:44.464822 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:44 crc kubenswrapper[4956]: I0314 08:57:44.464962 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:44 crc kubenswrapper[4956]: I0314 08:57:44.465913 4956 scope.go:117] "RemoveContainer" containerID="f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa" Mar 14 08:57:44 crc kubenswrapper[4956]: E0314 08:57:44.466403 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting 
failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:57:45 crc kubenswrapper[4956]: I0314 08:57:45.150657 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:57:45 crc kubenswrapper[4956]: I0314 08:57:45.209028 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:45 crc kubenswrapper[4956]: I0314 08:57:45.210502 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:45 crc kubenswrapper[4956]: I0314 08:57:45.210535 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:45 crc kubenswrapper[4956]: I0314 08:57:45.210543 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:45 crc kubenswrapper[4956]: E0314 08:57:45.306742 4956 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 08:57:45 crc kubenswrapper[4956]: E0314 08:57:45.901425 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 14 08:57:45 crc kubenswrapper[4956]: I0314 08:57:45.908533 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:45 crc kubenswrapper[4956]: I0314 08:57:45.910429 4956 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:45 crc kubenswrapper[4956]: I0314 08:57:45.910524 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:45 crc kubenswrapper[4956]: I0314 08:57:45.910548 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:45 crc kubenswrapper[4956]: I0314 08:57:45.910602 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:57:45 crc kubenswrapper[4956]: E0314 08:57:45.918091 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 14 08:57:46 crc kubenswrapper[4956]: I0314 08:57:46.155027 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:57:47 crc kubenswrapper[4956]: I0314 08:57:47.152510 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:57:48 crc kubenswrapper[4956]: I0314 08:57:48.153996 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:57:48 crc kubenswrapper[4956]: I0314 08:57:48.405363 4956 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 08:57:48 crc kubenswrapper[4956]: I0314 08:57:48.434997 4956 reflector.go:368] Caches 
populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 14 08:57:49 crc kubenswrapper[4956]: I0314 08:57:49.142408 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:57:49 crc kubenswrapper[4956]: I0314 08:57:49.142587 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:49 crc kubenswrapper[4956]: I0314 08:57:49.143727 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:49 crc kubenswrapper[4956]: I0314 08:57:49.143778 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:49 crc kubenswrapper[4956]: I0314 08:57:49.143790 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:49 crc kubenswrapper[4956]: I0314 08:57:49.147384 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:57:49 crc kubenswrapper[4956]: I0314 08:57:49.151459 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:57:49 crc kubenswrapper[4956]: I0314 08:57:49.881854 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:49 crc kubenswrapper[4956]: I0314 08:57:49.883669 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:49 crc kubenswrapper[4956]: I0314 08:57:49.883731 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 14 08:57:49 crc kubenswrapper[4956]: I0314 08:57:49.883752 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:50 crc kubenswrapper[4956]: I0314 08:57:50.150357 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:57:50 crc kubenswrapper[4956]: I0314 08:57:50.700074 4956 csr.go:261] certificate signing request csr-vx6fd is approved, waiting to be issued Mar 14 08:57:50 crc kubenswrapper[4956]: I0314 08:57:50.708102 4956 csr.go:257] certificate signing request csr-vx6fd is issued Mar 14 08:57:50 crc kubenswrapper[4956]: I0314 08:57:50.740958 4956 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 14 08:57:50 crc kubenswrapper[4956]: I0314 08:57:50.996174 4956 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 14 08:57:51 crc kubenswrapper[4956]: I0314 08:57:51.710104 4956 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-02 13:44:08.038841378 +0000 UTC Mar 14 08:57:51 crc kubenswrapper[4956]: I0314 08:57:51.710174 4956 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6316h46m16.328674719s for next certificate rotation Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.446426 4956 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.918467 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.920005 4956 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.920080 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.920101 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.920232 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.931362 4956 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.931678 4956 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 14 08:57:52 crc kubenswrapper[4956]: E0314 08:57:52.931702 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.942627 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.942685 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.942703 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.942728 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.942749 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:52Z","lastTransitionTime":"2026-03-14T08:57:52Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:57:52 crc kubenswrapper[4956]: E0314 08:57:52.978780 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dc
f31294f9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.985318 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.985348 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.985356 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.985370 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:52 crc kubenswrapper[4956]: I0314 08:57:52.985380 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:52Z","lastTransitionTime":"2026-03-14T08:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:57:53 crc kubenswrapper[4956]: E0314 08:57:53.000397 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:57:53 crc kubenswrapper[4956]: I0314 08:57:53.003172 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:53 crc kubenswrapper[4956]: I0314 08:57:53.003206 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:53 crc kubenswrapper[4956]: I0314 08:57:53.003218 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:53 crc kubenswrapper[4956]: I0314 08:57:53.003233 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:53 crc kubenswrapper[4956]: I0314 08:57:53.003247 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:53Z","lastTransitionTime":"2026-03-14T08:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:57:53 crc kubenswrapper[4956]: E0314 08:57:53.013513 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:57:53 crc kubenswrapper[4956]: I0314 08:57:53.016683 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:53 crc kubenswrapper[4956]: I0314 08:57:53.016718 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:53 crc kubenswrapper[4956]: I0314 08:57:53.016726 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:53 crc kubenswrapper[4956]: I0314 08:57:53.016740 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:53 crc kubenswrapper[4956]: I0314 08:57:53.016751 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:53Z","lastTransitionTime":"2026-03-14T08:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:57:53 crc kubenswrapper[4956]: E0314 08:57:53.026593 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:57:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:57:53 crc kubenswrapper[4956]: E0314 08:57:53.026703 4956 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:57:53 crc kubenswrapper[4956]: E0314 08:57:53.026733 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:53 crc kubenswrapper[4956]: E0314 08:57:53.126852 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:53 crc kubenswrapper[4956]: E0314 08:57:53.227424 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:53 crc kubenswrapper[4956]: E0314 08:57:53.328192 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:53 crc kubenswrapper[4956]: E0314 08:57:53.428917 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:53 crc kubenswrapper[4956]: E0314 08:57:53.529630 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:53 crc kubenswrapper[4956]: E0314 08:57:53.630128 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:53 crc kubenswrapper[4956]: E0314 08:57:53.731125 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:53 crc kubenswrapper[4956]: E0314 08:57:53.831333 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:53 crc kubenswrapper[4956]: E0314 08:57:53.932296 4956 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:54 crc kubenswrapper[4956]: E0314 08:57:54.033290 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:54 crc kubenswrapper[4956]: E0314 08:57:54.134138 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:54 crc kubenswrapper[4956]: E0314 08:57:54.235354 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:54 crc kubenswrapper[4956]: E0314 08:57:54.336198 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:54 crc kubenswrapper[4956]: E0314 08:57:54.436590 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:54 crc kubenswrapper[4956]: E0314 08:57:54.537187 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:54 crc kubenswrapper[4956]: E0314 08:57:54.638250 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:54 crc kubenswrapper[4956]: E0314 08:57:54.738955 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:54 crc kubenswrapper[4956]: E0314 08:57:54.840066 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:54 crc kubenswrapper[4956]: E0314 08:57:54.940641 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:55 crc kubenswrapper[4956]: E0314 08:57:55.041335 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:55 crc 
kubenswrapper[4956]: E0314 08:57:55.141536 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:55 crc kubenswrapper[4956]: E0314 08:57:55.242472 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:55 crc kubenswrapper[4956]: E0314 08:57:55.307585 4956 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 08:57:55 crc kubenswrapper[4956]: E0314 08:57:55.343184 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:55 crc kubenswrapper[4956]: E0314 08:57:55.444012 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:55 crc kubenswrapper[4956]: E0314 08:57:55.544595 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:55 crc kubenswrapper[4956]: E0314 08:57:55.645226 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:55 crc kubenswrapper[4956]: E0314 08:57:55.746226 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:55 crc kubenswrapper[4956]: E0314 08:57:55.847171 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:55 crc kubenswrapper[4956]: E0314 08:57:55.947671 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:56 crc kubenswrapper[4956]: E0314 08:57:56.048503 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:56 crc kubenswrapper[4956]: E0314 08:57:56.149539 4956 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 14 08:57:56 crc kubenswrapper[4956]: I0314 08:57:56.208581 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:57:56 crc kubenswrapper[4956]: I0314 08:57:56.209708 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:56 crc kubenswrapper[4956]: I0314 08:57:56.209830 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:56 crc kubenswrapper[4956]: I0314 08:57:56.209854 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:56 crc kubenswrapper[4956]: I0314 08:57:56.210798 4956 scope.go:117] "RemoveContainer" containerID="f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa" Mar 14 08:57:56 crc kubenswrapper[4956]: E0314 08:57:56.211108 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:57:56 crc kubenswrapper[4956]: E0314 08:57:56.249930 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:56 crc kubenswrapper[4956]: E0314 08:57:56.350806 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:56 crc kubenswrapper[4956]: E0314 08:57:56.451829 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:56 crc kubenswrapper[4956]: E0314 08:57:56.552965 4956 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 14 08:57:56 crc kubenswrapper[4956]: E0314 08:57:56.653566 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:56 crc kubenswrapper[4956]: E0314 08:57:56.754660 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:56 crc kubenswrapper[4956]: E0314 08:57:56.855871 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:56 crc kubenswrapper[4956]: E0314 08:57:56.957049 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:57 crc kubenswrapper[4956]: E0314 08:57:57.058167 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:57 crc kubenswrapper[4956]: E0314 08:57:57.158992 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:57 crc kubenswrapper[4956]: E0314 08:57:57.259448 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:57 crc kubenswrapper[4956]: E0314 08:57:57.360518 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:57 crc kubenswrapper[4956]: E0314 08:57:57.461746 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:57 crc kubenswrapper[4956]: E0314 08:57:57.562017 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:57 crc kubenswrapper[4956]: E0314 08:57:57.663187 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:57 crc kubenswrapper[4956]: E0314 08:57:57.764338 4956 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:57 crc kubenswrapper[4956]: E0314 08:57:57.864471 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:57 crc kubenswrapper[4956]: E0314 08:57:57.964644 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:58 crc kubenswrapper[4956]: E0314 08:57:58.065666 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:58 crc kubenswrapper[4956]: E0314 08:57:58.166785 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:58 crc kubenswrapper[4956]: E0314 08:57:58.267010 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.332324 4956 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.369232 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.369265 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.369276 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.369291 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.369302 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:58Z","lastTransitionTime":"2026-03-14T08:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.471974 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.472043 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.472061 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.472122 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.472145 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:58Z","lastTransitionTime":"2026-03-14T08:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.574337 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.574381 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.574399 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.574420 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.574434 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:58Z","lastTransitionTime":"2026-03-14T08:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.677362 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.677411 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.677428 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.677451 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.677469 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:58Z","lastTransitionTime":"2026-03-14T08:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.780007 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.780074 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.780097 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.780122 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.780139 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:58Z","lastTransitionTime":"2026-03-14T08:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.883681 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.883751 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.883771 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.883796 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.883814 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:58Z","lastTransitionTime":"2026-03-14T08:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.986924 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.987005 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.987028 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.987059 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:58 crc kubenswrapper[4956]: I0314 08:57:58.987084 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:58Z","lastTransitionTime":"2026-03-14T08:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.090100 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.090164 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.090181 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.090208 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.090226 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:59Z","lastTransitionTime":"2026-03-14T08:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.174895 4956 apiserver.go:52] "Watching apiserver" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.180038 4956 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.180349 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.180815 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.180980 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.181076 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.181100 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.181202 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.181357 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.181675 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.181779 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.181838 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.184194 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.184544 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.184621 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.184691 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.185138 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.185282 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.185285 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.186051 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.186086 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.192659 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.192706 4956 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.192720 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.192742 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.192757 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:59Z","lastTransitionTime":"2026-03-14T08:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.228055 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.248444 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.248618 4956 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.270233 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.285514 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.295533 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.295583 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.295599 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.295645 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.295663 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:59Z","lastTransitionTime":"2026-03-14T08:57:59Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.299152 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.299431 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.299517 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.299578 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.299615 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.299650 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.299708 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.299989 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.300032 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). 
InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.300209 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.300622 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.300696 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.300740 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.300940 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301093 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301196 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301233 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301289 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301332 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301372 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301477 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301555 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301329 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: 
"b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301419 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301593 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301676 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301707 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301737 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 
08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301766 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301802 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301918 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301922 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301951 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.301982 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302014 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302065 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302172 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302220 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302314 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302266 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302352 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302367 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302405 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302541 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302622 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302695 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302726 4956 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302767 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302767 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302801 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302931 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302961 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302991 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.303022 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.303129 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.303281 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.303323 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 
08:57:59.303356 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.303389 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.303476 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.303532 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.303707 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.303842 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.303881 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.305382 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.305561 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.305672 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.305716 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 08:57:59 
crc kubenswrapper[4956]: I0314 08:57:59.305790 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.305833 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.305870 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.305915 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.305958 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.306041 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.306088 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.306223 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.306284 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.306329 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.306382 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 14 08:57:59 crc 
kubenswrapper[4956]: I0314 08:57:59.306590 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.306641 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.306722 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.306881 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.306928 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.306965 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.307069 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.307148 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.307337 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.307391 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.307716 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: 
\"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.307764 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.307815 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.307856 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.308348 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.308442 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.308509 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.308626 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.308673 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.308716 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.308757 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.308944 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.308995 4956 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.309111 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.309171 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.309205 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.309282 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.310336 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.310387 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.310430 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.302984 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.303022 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.303218 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.303276 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.311100 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.303328 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.303627 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.303783 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.305176 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.305738 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.305823 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.305959 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.306027 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.306286 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.306408 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.311518 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.307034 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.307240 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.307402 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.307939 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.307997 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.308157 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.308526 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.308987 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.309108 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.309139 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.309477 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.309719 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.309681 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.310080 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.310161 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.310453 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.310684 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.310710 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.311053 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.311438 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.311960 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.312223 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.312257 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.312300 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.310745 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.312432 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.312505 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.312695 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.312754 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.312847 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.312884 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 
08:57:59.312926 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.312966 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.313019 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.313042 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.313140 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.313214 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.313256 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.313384 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.313424 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.312995 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.313468 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.313533 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.313561 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.313574 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.313672 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.313791 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.314082 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.315413 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.313674 4956 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.313699 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.313806 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.314044 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.314424 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.314676 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.314739 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.314761 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.314850 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.314936 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.314980 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.315379 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.315453 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.315813 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.315859 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.316009 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.316188 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.316316 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.315255 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.316827 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.317127 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.317315 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.315518 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.316683 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.317566 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.316712 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.317644 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.317667 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.317839 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.317880 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318087 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318158 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318202 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318241 4956 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318285 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318328 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318367 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318400 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318453 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318525 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318568 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318633 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318683 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318725 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 
08:57:59.318773 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318811 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318851 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318884 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318923 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318962 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.319003 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.319043 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.319086 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.319128 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.319171 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.319219 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.319267 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.319302 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.319351 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.319399 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.317835 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). 
InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.317925 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.317939 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.319632 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.317973 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318928 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.318943 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.319789 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.319838 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321202 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321295 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321358 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321411 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321458 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321548 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321598 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321645 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321691 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321741 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321790 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321840 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321892 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321954 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.322040 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.322098 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.320307 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.319842 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321366 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321586 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321849 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321892 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.319899 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.322104 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.321978 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.322370 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.322621 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.322968 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.322149 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.323467 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.323785 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.323867 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 14 
08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.323946 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.323985 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.324059 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.324427 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.324529 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.324615 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.324657 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.324826 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.324972 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.325064 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.325233 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 
14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.325315 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.325355 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.325764 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.325837 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.325873 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.325906 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.325939 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.325975 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.326008 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.326039 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.326071 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: 
I0314 08:57:59.326104 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.323075 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.323185 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.323406 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.323458 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.323505 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.323709 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.323737 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.323759 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.323957 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.324011 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.324031 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.324081 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.324619 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.324657 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.324923 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.324956 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.325153 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.325228 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.325624 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.326425 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.325786 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.326128 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.326579 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.326643 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.326955 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.327196 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.327470 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.326224 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.327999 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.328046 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.328089 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.328114 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod 
"0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.328120 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.328192 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.328218 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.328239 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.328259 4956 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.328178 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.328277 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.328353 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.328403 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:57:59 crc 
kubenswrapper[4956]: I0314 08:57:59.328445 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.328606 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.328518 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.328693 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.328713 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod 
"bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.328732 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.329676 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.329699 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.330230 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.330300 4956 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.330658 4956 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.330797 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:57:59.830760637 +0000 UTC m=+85.343452965 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.330911 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.331028 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.331174 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.331665 4956 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.331807 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:57:59.831777532 +0000 UTC m=+85.344469850 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.331909 4956 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.331943 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.331965 4956 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.331955 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.331983 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332003 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332022 4956 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332039 4956 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332058 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332076 4956 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332094 4956 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc 
kubenswrapper[4956]: I0314 08:57:59.332111 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332129 4956 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332148 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332166 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332183 4956 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332204 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332221 4956 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332239 4956 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332257 4956 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332275 4956 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332293 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332310 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332328 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332328 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 
08:57:59.332346 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332454 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332544 4956 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332565 4956 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332624 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332645 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332663 4956 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332717 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332737 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332803 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332825 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332843 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332901 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332922 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332940 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333000 4956 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333018 4956 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333036 4956 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333092 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333113 4956 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333132 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333189 4956 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 
08:57:59.333207 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333229 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333288 4956 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333307 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333362 4956 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333383 4956 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333401 4956 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333452 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333473 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333541 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333560 4956 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333578 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333596 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333653 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333672 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") 
on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333690 4956 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333707 4956 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333724 4956 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333743 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333799 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333834 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333855 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc 
kubenswrapper[4956]: I0314 08:57:59.333875 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333900 4956 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333920 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333938 4956 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333956 4956 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333979 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334059 4956 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334085 4956 
reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334110 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334130 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334152 4956 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334171 4956 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334191 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334210 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334229 4956 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334250 4956 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334268 4956 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334287 4956 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334307 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334325 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334343 4956 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334362 4956 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" 
(UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334381 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334444 4956 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334464 4956 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334499 4956 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334517 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334536 4956 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334554 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" 
DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334571 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334589 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334607 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334624 4956 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334642 4956 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334660 4956 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334679 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334697 4956 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334716 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334739 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334762 4956 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334786 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334818 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334838 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334855 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on 
node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334872 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334889 4956 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334906 4956 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334962 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334981 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335001 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335019 4956 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335038 4956 
reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335055 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335074 4956 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335093 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335157 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335183 4956 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335206 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335228 4956 reconciler_common.go:293] "Volume detached 
for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335247 4956 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335264 4956 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335282 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335299 4956 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335316 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335339 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.332456 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" 
(OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333406 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.333881 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.334394 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335056 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335511 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335510 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335622 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335806 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335870 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.335880 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.335943 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:57:59.83590657 +0000 UTC m=+85.348598878 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.336269 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.336537 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.336560 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.336961 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.337257 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.337329 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.337972 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.338013 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.338399 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.338881 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.338895 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.338954 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.339428 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.339733 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.340916 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.341224 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.350980 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.351280 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.351610 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.351639 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.351660 4956 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.351652 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.351769 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 08:57:59.851739029 +0000 UTC m=+85.364431387 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.352070 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.352382 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.352563 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.352776 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.354239 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.356033 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.356251 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.357694 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.358024 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.358042 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.358066 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.358082 4956 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.358134 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 08:57:59.858116782 +0000 UTC m=+85.370809150 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.358328 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.360033 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.360759 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.360843 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.361207 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.361358 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.361409 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.361428 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.361593 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.361891 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.361918 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.361993 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.362384 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.362855 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.363443 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.364155 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.368728 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.368959 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.369759 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.370847 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.386301 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.394932 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.398083 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.398110 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.398118 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.398100 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.398131 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.398197 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:59Z","lastTransitionTime":"2026-03-14T08:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437033 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.436957 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437167 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437261 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437325 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437340 4956 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437371 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437385 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437402 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437414 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437425 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437435 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437447 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437457 4956 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437467 4956 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437500 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437511 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437521 4956 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437531 4956 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437546 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437558 4956 
reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437569 4956 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437580 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437592 4956 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437603 4956 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437614 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437625 4956 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437635 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437646 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437656 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437666 4956 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437677 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437690 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437701 4956 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437750 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node 
\"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437769 4956 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437782 4956 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437793 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437804 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437814 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437825 4956 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437835 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437845 4956 
reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437857 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437868 4956 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437879 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437889 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437900 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437911 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437923 4956 reconciler_common.go:293] "Volume detached for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437934 4956 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437945 4956 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437955 4956 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437966 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437986 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.437997 4956 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.438008 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.438019 4956 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.438030 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.438040 4956 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.438055 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.438066 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.438077 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.438089 4956 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 
14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.438099 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.438110 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.500950 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.500987 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.501001 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.501018 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.501030 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:59Z","lastTransitionTime":"2026-03-14T08:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.502735 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.519716 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.530548 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:57:59 crc kubenswrapper[4956]: W0314 08:57:59.535700 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-9ab838f4fba76290311bf1e8de601c4c858f77d768d4e195340b837de405e1d8 WatchSource:0}: Error finding container 9ab838f4fba76290311bf1e8de601c4c858f77d768d4e195340b837de405e1d8: Status 404 returned error can't find the container with id 9ab838f4fba76290311bf1e8de601c4c858f77d768d4e195340b837de405e1d8 Mar 14 08:57:59 crc kubenswrapper[4956]: W0314 08:57:59.557597 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-47723beb05f4016710b1ab07558f9b964a230d88e66a833d5e39d18e3cc48120 WatchSource:0}: Error finding container 47723beb05f4016710b1ab07558f9b964a230d88e66a833d5e39d18e3cc48120: Status 404 returned error can't find the container with id 47723beb05f4016710b1ab07558f9b964a230d88e66a833d5e39d18e3cc48120 Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.603708 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.603748 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.603788 4956 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.603805 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.603819 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:59Z","lastTransitionTime":"2026-03-14T08:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.706736 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.706794 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.706815 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.706843 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.706860 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:59Z","lastTransitionTime":"2026-03-14T08:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.809231 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.809287 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.809306 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.809334 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.809352 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:59Z","lastTransitionTime":"2026-03-14T08:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.841010 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.841138 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.841185 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.841266 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:58:00.84120981 +0000 UTC m=+86.353902118 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.841293 4956 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.841314 4956 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.841385 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:00.841365503 +0000 UTC m=+86.354057801 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.841414 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-14 08:58:00.841401534 +0000 UTC m=+86.354093832 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.911657 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.911695 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.911703 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.911719 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.911728 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:57:59Z","lastTransitionTime":"2026-03-14T08:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.912668 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e"} Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.912747 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4"} Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.912773 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9ab838f4fba76290311bf1e8de601c4c858f77d768d4e195340b837de405e1d8"} Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.914954 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1"} Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.915001 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b75514bd721786ed91134e9d3b1824936a597799f1386248639cf0bf70575bd8"} Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.915882 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"47723beb05f4016710b1ab07558f9b964a230d88e66a833d5e39d18e3cc48120"} Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.923412 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.933748 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.941706 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.941763 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.941934 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.941961 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.941965 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.941984 4956 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.942006 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.942024 4956 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:57:59 crc 
kubenswrapper[4956]: E0314 08:57:59.942068 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:00.942047012 +0000 UTC m=+86.454739310 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:57:59 crc kubenswrapper[4956]: E0314 08:57:59.942101 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:00.942079383 +0000 UTC m=+86.454771731 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.943244 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.952984 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.969542 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.979624 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:57:59 crc kubenswrapper[4956]: I0314 08:57:59.991334 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.007429 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.014132 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.014162 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.014171 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.014187 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.014201 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:00Z","lastTransitionTime":"2026-03-14T08:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.016625 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.025529 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.035371 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.044697 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.116749 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.116808 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.116826 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 
08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.116855 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.116874 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:00Z","lastTransitionTime":"2026-03-14T08:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.219374 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.219411 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.219421 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.219439 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.219449 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:00Z","lastTransitionTime":"2026-03-14T08:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.322603 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.322658 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.322675 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.322697 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.322713 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:00Z","lastTransitionTime":"2026-03-14T08:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.426335 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.426377 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.426390 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.426410 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.426424 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:00Z","lastTransitionTime":"2026-03-14T08:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.529356 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.529417 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.529434 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.529459 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.529477 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:00Z","lastTransitionTime":"2026-03-14T08:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.632558 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.632593 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.632601 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.632614 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.632625 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:00Z","lastTransitionTime":"2026-03-14T08:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.734831 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.734915 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.734949 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.734977 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.735002 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:00Z","lastTransitionTime":"2026-03-14T08:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.836976 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.837023 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.837038 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.837059 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.837076 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:00Z","lastTransitionTime":"2026-03-14T08:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.850909 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.851042 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:00 crc kubenswrapper[4956]: E0314 08:58:00.851168 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:58:02.851119662 +0000 UTC m=+88.363811970 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:58:00 crc kubenswrapper[4956]: E0314 08:58:00.851177 4956 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.851303 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:00 crc kubenswrapper[4956]: E0314 08:58:00.851389 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:02.851361308 +0000 UTC m=+88.364053596 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:58:00 crc kubenswrapper[4956]: E0314 08:58:00.851601 4956 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:58:00 crc kubenswrapper[4956]: E0314 08:58:00.851727 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:02.851688685 +0000 UTC m=+88.364380993 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.939336 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.939395 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.939418 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.939448 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.939473 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:00Z","lastTransitionTime":"2026-03-14T08:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.951951 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:00 crc kubenswrapper[4956]: I0314 08:58:00.951984 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:00 crc kubenswrapper[4956]: E0314 08:58:00.952088 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:58:00 crc kubenswrapper[4956]: E0314 08:58:00.952102 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:58:00 crc kubenswrapper[4956]: E0314 08:58:00.952112 4956 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:00 crc kubenswrapper[4956]: E0314 08:58:00.952167 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:02.952134929 +0000 UTC m=+88.464827187 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:00 crc kubenswrapper[4956]: E0314 08:58:00.952241 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:58:00 crc kubenswrapper[4956]: E0314 08:58:00.952281 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:58:00 crc kubenswrapper[4956]: E0314 08:58:00.952300 4956 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:00 crc kubenswrapper[4956]: E0314 08:58:00.952352 4956 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:02.952335413 +0000 UTC m=+88.465027721 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.041749 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.041799 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.041814 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.041835 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.041852 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:01Z","lastTransitionTime":"2026-03-14T08:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.144967 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.145004 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.145014 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.145030 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.145041 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:01Z","lastTransitionTime":"2026-03-14T08:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.208644 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.208668 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.208758 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:01 crc kubenswrapper[4956]: E0314 08:58:01.208760 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:01 crc kubenswrapper[4956]: E0314 08:58:01.208945 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:01 crc kubenswrapper[4956]: E0314 08:58:01.209042 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.212443 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.212995 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.213760 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.214308 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.214842 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.215321 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.215876 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.216385 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.216988 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.217459 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.217925 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.218570 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.219014 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.219506 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.219976 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.220455 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.220962 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.221330 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.225605 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.226276 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.227274 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.227968 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.228440 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.229464 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.230133 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.231124 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.231861 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.232872 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.233459 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.234507 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.234999 4956 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.235131 4956 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.237433 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.240042 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.240456 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.241879 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.242824 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.243301 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.244281 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.244958 4956 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.245742 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.251894 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.251931 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.251941 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.251959 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.251967 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:01Z","lastTransitionTime":"2026-03-14T08:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.253828 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.254618 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.255638 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.256169 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.257120 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.257627 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.258831 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.259398 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.260402 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.260864 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.261345 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.262272 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.262856 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.354698 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.354736 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.354746 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.354761 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.354773 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:01Z","lastTransitionTime":"2026-03-14T08:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.456970 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.457008 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.457019 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.457038 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.457050 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:01Z","lastTransitionTime":"2026-03-14T08:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.558851 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.558876 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.558883 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.558896 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.558904 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:01Z","lastTransitionTime":"2026-03-14T08:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.661802 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.661843 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.661853 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.661870 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.661880 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:01Z","lastTransitionTime":"2026-03-14T08:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.684377 4956 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.764054 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.764120 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.764140 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.764165 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.764186 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:01Z","lastTransitionTime":"2026-03-14T08:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.866279 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.866316 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.866325 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.866339 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.866348 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:01Z","lastTransitionTime":"2026-03-14T08:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.969235 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.969723 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.969744 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.969771 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:01 crc kubenswrapper[4956]: I0314 08:58:01.969789 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:01Z","lastTransitionTime":"2026-03-14T08:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.072405 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.072447 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.072456 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.072470 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.072522 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:02Z","lastTransitionTime":"2026-03-14T08:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.174760 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.174826 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.174849 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.174879 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.174903 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:02Z","lastTransitionTime":"2026-03-14T08:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.277772 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.277823 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.277841 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.277863 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.277880 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:02Z","lastTransitionTime":"2026-03-14T08:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.380510 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.380549 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.380563 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.380605 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.380617 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:02Z","lastTransitionTime":"2026-03-14T08:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.483106 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.483159 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.483176 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.483197 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.483215 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:02Z","lastTransitionTime":"2026-03-14T08:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.585699 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.585751 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.585770 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.585789 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.585803 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:02Z","lastTransitionTime":"2026-03-14T08:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.688182 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.688235 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.688245 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.688261 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.688292 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:02Z","lastTransitionTime":"2026-03-14T08:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.790300 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.790348 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.790359 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.790378 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.790391 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:02Z","lastTransitionTime":"2026-03-14T08:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.867909 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.868023 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.868064 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:02 crc kubenswrapper[4956]: E0314 08:58:02.868206 4956 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:58:02 crc kubenswrapper[4956]: E0314 08:58:02.868292 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:58:06.868258581 +0000 UTC m=+92.380950869 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:58:02 crc kubenswrapper[4956]: E0314 08:58:02.868317 4956 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:58:02 crc kubenswrapper[4956]: E0314 08:58:02.868404 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:06.868388344 +0000 UTC m=+92.381080732 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:58:02 crc kubenswrapper[4956]: E0314 08:58:02.868461 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:06.868431225 +0000 UTC m=+92.381123543 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.892706 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.892769 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.892785 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.892804 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.892814 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:02Z","lastTransitionTime":"2026-03-14T08:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.925702 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2"} Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.939664 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"n
ame\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:02Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.954058 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:02Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.967949 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:02Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:02 crc kubenswrapper[4956]: E0314 08:58:02.968600 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.968450 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:02 crc kubenswrapper[4956]: E0314 08:58:02.968630 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:58:02 crc kubenswrapper[4956]: E0314 08:58:02.968645 4956 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.968673 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:02 crc kubenswrapper[4956]: E0314 08:58:02.968691 4956 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:06.968672353 +0000 UTC m=+92.481364621 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:02 crc kubenswrapper[4956]: E0314 08:58:02.968886 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:58:02 crc kubenswrapper[4956]: E0314 08:58:02.968918 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:58:02 crc kubenswrapper[4956]: E0314 08:58:02.968936 4956 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:02 crc kubenswrapper[4956]: E0314 08:58:02.968990 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:06.968974771 +0000 UTC m=+92.481667049 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.980565 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:02Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.991503 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:02Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.995020 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.995076 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.995093 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:02 crc 
kubenswrapper[4956]: I0314 08:58:02.995116 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:02 crc kubenswrapper[4956]: I0314 08:58:02.995136 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:02Z","lastTransitionTime":"2026-03-14T08:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.002773 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:03Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.082947 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.083009 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.083027 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.083050 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.083069 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:03Z","lastTransitionTime":"2026-03-14T08:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:03 crc kubenswrapper[4956]: E0314 08:58:03.099277 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:03Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.102952 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.102999 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.103011 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.103029 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.103041 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:03Z","lastTransitionTime":"2026-03-14T08:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.120009 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.120039 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.120048 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.120063 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.120076 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:03Z","lastTransitionTime":"2026-03-14T08:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.139714 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.139764 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.139774 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.139791 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.139802 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:03Z","lastTransitionTime":"2026-03-14T08:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:03 crc kubenswrapper[4956]: E0314 08:58:03.150105 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:03Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.152981 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.153012 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.153020 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.153034 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.153043 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:03Z","lastTransitionTime":"2026-03-14T08:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:03 crc kubenswrapper[4956]: E0314 08:58:03.164638 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:03Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:03 crc kubenswrapper[4956]: E0314 08:58:03.164755 4956 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.166280 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.166307 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.166315 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.166327 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.166336 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:03Z","lastTransitionTime":"2026-03-14T08:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.209310 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.209395 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.209402 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:03 crc kubenswrapper[4956]: E0314 08:58:03.209445 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:03 crc kubenswrapper[4956]: E0314 08:58:03.209514 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:03 crc kubenswrapper[4956]: E0314 08:58:03.209638 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.269719 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.269789 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.269806 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.269829 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.269846 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:03Z","lastTransitionTime":"2026-03-14T08:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.372980 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.373021 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.373034 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.373053 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.373067 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:03Z","lastTransitionTime":"2026-03-14T08:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.476202 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.476321 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.476341 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.476372 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.476394 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:03Z","lastTransitionTime":"2026-03-14T08:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.579653 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.579724 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.579751 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.579780 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.579800 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:03Z","lastTransitionTime":"2026-03-14T08:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.681972 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.682384 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.682590 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.682801 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.682965 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:03Z","lastTransitionTime":"2026-03-14T08:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.785598 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.785654 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.785671 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.785698 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.785716 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:03Z","lastTransitionTime":"2026-03-14T08:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.865538 4956 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.888047 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.888100 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.888123 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.888149 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.888169 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:03Z","lastTransitionTime":"2026-03-14T08:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.992840 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.992882 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.992892 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.992933 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:03 crc kubenswrapper[4956]: I0314 08:58:03.992947 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:03Z","lastTransitionTime":"2026-03-14T08:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.096305 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.096416 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.096442 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.096470 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.096529 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:04Z","lastTransitionTime":"2026-03-14T08:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.199268 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.199314 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.199330 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.199353 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.199370 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:04Z","lastTransitionTime":"2026-03-14T08:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.301744 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.302004 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.302016 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.302028 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.302037 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:04Z","lastTransitionTime":"2026-03-14T08:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.404466 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.404553 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.404569 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.404594 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.404612 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:04Z","lastTransitionTime":"2026-03-14T08:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.506655 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.506707 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.506721 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.506738 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.507111 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:04Z","lastTransitionTime":"2026-03-14T08:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.609562 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.609616 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.609629 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.609648 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.609660 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:04Z","lastTransitionTime":"2026-03-14T08:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.712680 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.712765 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.712797 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.712826 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.712848 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:04Z","lastTransitionTime":"2026-03-14T08:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.815575 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.815641 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.815660 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.815689 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.815708 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:04Z","lastTransitionTime":"2026-03-14T08:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.919278 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.919344 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.919361 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.919386 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:04 crc kubenswrapper[4956]: I0314 08:58:04.919403 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:04Z","lastTransitionTime":"2026-03-14T08:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.022981 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.023019 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.023033 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.023049 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.023061 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:05Z","lastTransitionTime":"2026-03-14T08:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.125774 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.125835 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.125846 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.125860 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.125870 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:05Z","lastTransitionTime":"2026-03-14T08:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.209180 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.209180 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.209345 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:05 crc kubenswrapper[4956]: E0314 08:58:05.209451 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:05 crc kubenswrapper[4956]: E0314 08:58:05.209566 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:05 crc kubenswrapper[4956]: E0314 08:58:05.209660 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.227340 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.228938 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.229003 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.229019 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.229057 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.229068 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:05Z","lastTransitionTime":"2026-03-14T08:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.241830 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.257716 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\
"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.275441 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.287207 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.301242 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.330599 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 
08:58:05.330648 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.330664 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.330685 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.330701 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:05Z","lastTransitionTime":"2026-03-14T08:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.433089 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.433164 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.433187 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.433216 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.433238 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:05Z","lastTransitionTime":"2026-03-14T08:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.535474 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.535564 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.535583 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.535608 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.535625 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:05Z","lastTransitionTime":"2026-03-14T08:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.638793 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.638833 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.638846 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.638862 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.638872 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:05Z","lastTransitionTime":"2026-03-14T08:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.741232 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.741278 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.741293 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.741311 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.741325 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:05Z","lastTransitionTime":"2026-03-14T08:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.844242 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.844316 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.844328 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.844344 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.844360 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:05Z","lastTransitionTime":"2026-03-14T08:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.947243 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.947310 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.947351 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.947385 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:05 crc kubenswrapper[4956]: I0314 08:58:05.947410 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:05Z","lastTransitionTime":"2026-03-14T08:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.050356 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.050416 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.050434 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.050458 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.050476 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:06Z","lastTransitionTime":"2026-03-14T08:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.153769 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.153862 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.153884 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.153908 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.153924 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:06Z","lastTransitionTime":"2026-03-14T08:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.256334 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.256733 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.256900 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.257069 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.257221 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:06Z","lastTransitionTime":"2026-03-14T08:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.360516 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.360576 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.360595 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.360621 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.360639 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:06Z","lastTransitionTime":"2026-03-14T08:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.462715 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.463074 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.463208 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.463347 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.463477 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:06Z","lastTransitionTime":"2026-03-14T08:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.565510 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.565563 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.565574 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.565592 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.565604 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:06Z","lastTransitionTime":"2026-03-14T08:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.668337 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.668663 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.668968 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.669104 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.669225 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:06Z","lastTransitionTime":"2026-03-14T08:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.771739 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.771775 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.771785 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.771803 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.771813 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:06Z","lastTransitionTime":"2026-03-14T08:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.873859 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.873889 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.873897 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.873944 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.873955 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:06Z","lastTransitionTime":"2026-03-14T08:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.900800 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.900921 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.901019 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:06 crc kubenswrapper[4956]: E0314 08:58:06.901142 4956 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:58:06 crc kubenswrapper[4956]: E0314 08:58:06.901233 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:14.901204859 +0000 UTC m=+100.413897167 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:58:06 crc kubenswrapper[4956]: E0314 08:58:06.901278 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:58:14.90125091 +0000 UTC m=+100.413943228 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:58:06 crc kubenswrapper[4956]: E0314 08:58:06.901350 4956 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:58:06 crc kubenswrapper[4956]: E0314 08:58:06.901459 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:14.901432304 +0000 UTC m=+100.414124612 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.976585 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.976643 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.976659 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.976683 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:06 crc kubenswrapper[4956]: I0314 08:58:06.976698 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:06Z","lastTransitionTime":"2026-03-14T08:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.002412 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.002516 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:07 crc kubenswrapper[4956]: E0314 08:58:07.002664 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:58:07 crc kubenswrapper[4956]: E0314 08:58:07.002675 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:58:07 crc kubenswrapper[4956]: E0314 08:58:07.002726 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:58:07 crc kubenswrapper[4956]: E0314 08:58:07.002750 4956 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:07 crc 
kubenswrapper[4956]: E0314 08:58:07.002687 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:58:07 crc kubenswrapper[4956]: E0314 08:58:07.002821 4956 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:07 crc kubenswrapper[4956]: E0314 08:58:07.002831 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:15.002803779 +0000 UTC m=+100.515496077 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:07 crc kubenswrapper[4956]: E0314 08:58:07.002872 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:15.002856491 +0000 UTC m=+100.515548769 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.079323 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.079364 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.079372 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.079387 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.079396 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:07Z","lastTransitionTime":"2026-03-14T08:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.181930 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.181973 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.181983 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.181998 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.182010 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:07Z","lastTransitionTime":"2026-03-14T08:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.208876 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.208918 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:07 crc kubenswrapper[4956]: E0314 08:58:07.209071 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.208889 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:07 crc kubenswrapper[4956]: E0314 08:58:07.209221 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:07 crc kubenswrapper[4956]: E0314 08:58:07.209434 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.284575 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.284823 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.284924 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.284999 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.285058 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:07Z","lastTransitionTime":"2026-03-14T08:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.387950 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.388005 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.388020 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.388041 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.388060 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:07Z","lastTransitionTime":"2026-03-14T08:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.493989 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.494060 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.494078 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.494103 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.494120 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:07Z","lastTransitionTime":"2026-03-14T08:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.597588 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.597678 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.597696 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.597721 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.597737 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:07Z","lastTransitionTime":"2026-03-14T08:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.700641 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.700703 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.700732 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.700756 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.700777 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:07Z","lastTransitionTime":"2026-03-14T08:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.803658 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.803711 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.803728 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.803750 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.803768 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:07Z","lastTransitionTime":"2026-03-14T08:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.909042 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.909689 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.909811 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.909901 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:07 crc kubenswrapper[4956]: I0314 08:58:07.909978 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:07Z","lastTransitionTime":"2026-03-14T08:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.013022 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.013090 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.013112 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.013141 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.013163 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:08Z","lastTransitionTime":"2026-03-14T08:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.116050 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.116114 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.116130 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.116152 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.116171 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:08Z","lastTransitionTime":"2026-03-14T08:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.218933 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.218992 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.219009 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.219032 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.219049 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:08Z","lastTransitionTime":"2026-03-14T08:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.321743 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.321774 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.321783 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.321797 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.321805 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:08Z","lastTransitionTime":"2026-03-14T08:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.424594 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.424629 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.424637 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.424650 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.424658 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:08Z","lastTransitionTime":"2026-03-14T08:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.526798 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.526858 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.526877 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.526909 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.526927 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:08Z","lastTransitionTime":"2026-03-14T08:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.629669 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.629752 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.629780 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.629810 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.629827 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:08Z","lastTransitionTime":"2026-03-14T08:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.733002 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.733058 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.733074 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.733100 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.733121 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:08Z","lastTransitionTime":"2026-03-14T08:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.835076 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.835105 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.835113 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.835126 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.835136 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:08Z","lastTransitionTime":"2026-03-14T08:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.938449 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.938599 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.938614 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.938631 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:08 crc kubenswrapper[4956]: I0314 08:58:08.938646 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:08Z","lastTransitionTime":"2026-03-14T08:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.041525 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.041593 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.041603 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.041635 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.041647 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:09Z","lastTransitionTime":"2026-03-14T08:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.143380 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.143422 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.143434 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.143451 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.143464 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:09Z","lastTransitionTime":"2026-03-14T08:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.209250 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.209343 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.209277 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:09 crc kubenswrapper[4956]: E0314 08:58:09.209439 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:09 crc kubenswrapper[4956]: E0314 08:58:09.209598 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:09 crc kubenswrapper[4956]: E0314 08:58:09.209801 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.246681 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.246756 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.246773 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.246798 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.246815 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:09Z","lastTransitionTime":"2026-03-14T08:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.349346 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.349423 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.349443 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.349467 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.349524 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:09Z","lastTransitionTime":"2026-03-14T08:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.452728 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.452781 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.452791 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.452804 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.452831 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:09Z","lastTransitionTime":"2026-03-14T08:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.556094 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.556178 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.556196 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.556706 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.556770 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:09Z","lastTransitionTime":"2026-03-14T08:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.660568 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.660615 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.660625 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.660642 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.660653 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:09Z","lastTransitionTime":"2026-03-14T08:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.763115 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.763201 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.763217 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.763239 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.763256 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:09Z","lastTransitionTime":"2026-03-14T08:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.865325 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.865363 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.865373 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.865389 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.865401 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:09Z","lastTransitionTime":"2026-03-14T08:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.968296 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.968342 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.968358 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.968374 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:09 crc kubenswrapper[4956]: I0314 08:58:09.968385 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:09Z","lastTransitionTime":"2026-03-14T08:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.070362 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.070419 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.070441 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.070468 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.070542 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:10Z","lastTransitionTime":"2026-03-14T08:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.172837 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.172879 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.172891 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.172908 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.172922 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:10Z","lastTransitionTime":"2026-03-14T08:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.275655 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.275689 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.275698 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.275712 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.275722 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:10Z","lastTransitionTime":"2026-03-14T08:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.378380 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.378408 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.378416 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.378429 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.378438 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:10Z","lastTransitionTime":"2026-03-14T08:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.481145 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.481188 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.481199 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.481215 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.481231 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:10Z","lastTransitionTime":"2026-03-14T08:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.583235 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.583303 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.583325 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.583355 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.583376 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:10Z","lastTransitionTime":"2026-03-14T08:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.686343 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.686639 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.686740 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.686839 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.686936 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:10Z","lastTransitionTime":"2026-03-14T08:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.789142 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.789205 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.789222 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.789245 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.789262 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:10Z","lastTransitionTime":"2026-03-14T08:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.891847 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.891904 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.891924 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.891951 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.891977 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:10Z","lastTransitionTime":"2026-03-14T08:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.995042 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.995110 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.995127 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.995152 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:10 crc kubenswrapper[4956]: I0314 08:58:10.995170 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:10Z","lastTransitionTime":"2026-03-14T08:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.098078 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.098138 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.098160 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.098189 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.098210 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:11Z","lastTransitionTime":"2026-03-14T08:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.201453 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.201575 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.201593 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.201614 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.201633 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:11Z","lastTransitionTime":"2026-03-14T08:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.208884 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:11 crc kubenswrapper[4956]: E0314 08:58:11.209087 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.209179 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:11 crc kubenswrapper[4956]: E0314 08:58:11.209332 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.209785 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:11 crc kubenswrapper[4956]: E0314 08:58:11.210117 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.225942 4956 scope.go:117] "RemoveContainer" containerID="f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa" Mar 14 08:58:11 crc kubenswrapper[4956]: E0314 08:58:11.226595 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.227097 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.304535 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.304578 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.304590 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.304608 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.304621 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:11Z","lastTransitionTime":"2026-03-14T08:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.407147 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.407181 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.407195 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.407212 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.407223 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:11Z","lastTransitionTime":"2026-03-14T08:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.510095 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.510136 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.510150 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.510167 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.510179 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:11Z","lastTransitionTime":"2026-03-14T08:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.612536 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.612560 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.612569 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.612581 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.612589 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:11Z","lastTransitionTime":"2026-03-14T08:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.715067 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.715123 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.715141 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.715200 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.715217 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:11Z","lastTransitionTime":"2026-03-14T08:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.819182 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.819242 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.819264 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.819293 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.819318 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:11Z","lastTransitionTime":"2026-03-14T08:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.921894 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.921971 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.921998 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.922027 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.922046 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:11Z","lastTransitionTime":"2026-03-14T08:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:11 crc kubenswrapper[4956]: I0314 08:58:11.950789 4956 scope.go:117] "RemoveContainer" containerID="f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa" Mar 14 08:58:11 crc kubenswrapper[4956]: E0314 08:58:11.951092 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.024299 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.024360 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.024370 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.024384 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.024392 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:12Z","lastTransitionTime":"2026-03-14T08:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.126798 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.126857 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.126872 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.126890 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.126902 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:12Z","lastTransitionTime":"2026-03-14T08:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.229661 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.229707 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.229718 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.229733 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.229746 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:12Z","lastTransitionTime":"2026-03-14T08:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.332375 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.332469 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.332520 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.332549 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.332570 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:12Z","lastTransitionTime":"2026-03-14T08:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.435821 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.435879 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.435896 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.435923 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.435940 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:12Z","lastTransitionTime":"2026-03-14T08:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.538851 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.538899 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.538910 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.538927 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.538939 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:12Z","lastTransitionTime":"2026-03-14T08:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.641850 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.641906 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.641916 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.641928 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.641958 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:12Z","lastTransitionTime":"2026-03-14T08:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.744682 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.744756 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.744799 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.744836 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.744858 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:12Z","lastTransitionTime":"2026-03-14T08:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.848309 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.848427 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.848445 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.848471 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.848527 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:12Z","lastTransitionTime":"2026-03-14T08:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.950572 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.950621 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.950631 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.950645 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:12 crc kubenswrapper[4956]: I0314 08:58:12.950656 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:12Z","lastTransitionTime":"2026-03-14T08:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.052539 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.052585 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.052595 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.052612 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.052622 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:13Z","lastTransitionTime":"2026-03-14T08:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.155440 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.155500 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.155509 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.155522 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.155531 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:13Z","lastTransitionTime":"2026-03-14T08:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.208974 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.209028 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.209051 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:13 crc kubenswrapper[4956]: E0314 08:58:13.209109 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:13 crc kubenswrapper[4956]: E0314 08:58:13.209174 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:13 crc kubenswrapper[4956]: E0314 08:58:13.209263 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.258450 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.258507 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.258516 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.258533 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.258542 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:13Z","lastTransitionTime":"2026-03-14T08:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.360311 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.360372 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.360389 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.360414 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.360432 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:13Z","lastTransitionTime":"2026-03-14T08:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.462960 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.462996 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.463004 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.463016 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.463024 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:13Z","lastTransitionTime":"2026-03-14T08:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.539183 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.539261 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.539278 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.539333 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.539356 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:13Z","lastTransitionTime":"2026-03-14T08:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:13 crc kubenswrapper[4956]: E0314 08:58:13.560504 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.565827 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.565878 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.565894 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.565915 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.565931 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:13Z","lastTransitionTime":"2026-03-14T08:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:13 crc kubenswrapper[4956]: E0314 08:58:13.586793 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.590829 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.590869 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.590881 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.590898 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.590909 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:13Z","lastTransitionTime":"2026-03-14T08:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:13 crc kubenswrapper[4956]: E0314 08:58:13.603706 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.608316 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.608380 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.608404 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.608429 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.608447 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:13Z","lastTransitionTime":"2026-03-14T08:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:13 crc kubenswrapper[4956]: E0314 08:58:13.622278 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.626169 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.626215 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.626227 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.626245 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.626258 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:13Z","lastTransitionTime":"2026-03-14T08:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:13 crc kubenswrapper[4956]: E0314 08:58:13.643319 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:13 crc kubenswrapper[4956]: E0314 08:58:13.643498 4956 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.644841 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.644869 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.644881 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.644896 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.644907 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:13Z","lastTransitionTime":"2026-03-14T08:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.748461 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.748578 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.748606 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.748638 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.748661 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:13Z","lastTransitionTime":"2026-03-14T08:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.851122 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.851179 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.851195 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.851221 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.851241 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:13Z","lastTransitionTime":"2026-03-14T08:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.953730 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.953803 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.953825 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.953851 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:13 crc kubenswrapper[4956]: I0314 08:58:13.953878 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:13Z","lastTransitionTime":"2026-03-14T08:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.058777 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.059178 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.059220 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.059270 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.059289 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:14Z","lastTransitionTime":"2026-03-14T08:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.162315 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.162363 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.162387 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.162409 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.162425 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:14Z","lastTransitionTime":"2026-03-14T08:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.265268 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.265338 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.265361 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.265390 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.265415 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:14Z","lastTransitionTime":"2026-03-14T08:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.369879 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.369968 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.370000 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.370033 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.370059 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:14Z","lastTransitionTime":"2026-03-14T08:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.473100 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.473172 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.473192 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.473219 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.473238 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:14Z","lastTransitionTime":"2026-03-14T08:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.575906 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.575991 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.576010 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.576039 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.576060 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:14Z","lastTransitionTime":"2026-03-14T08:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.679572 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.679643 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.679668 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.679699 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.679728 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:14Z","lastTransitionTime":"2026-03-14T08:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.783100 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.783180 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.783217 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.783252 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.783277 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:14Z","lastTransitionTime":"2026-03-14T08:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.886020 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.886061 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.886074 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.886089 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.886100 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:14Z","lastTransitionTime":"2026-03-14T08:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.972221 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.972363 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:14 crc kubenswrapper[4956]: E0314 08:58:14.972399 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:58:30.972364889 +0000 UTC m=+116.485057197 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.972455 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:14 crc kubenswrapper[4956]: E0314 08:58:14.972556 4956 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:58:14 crc kubenswrapper[4956]: E0314 08:58:14.972641 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:30.972615005 +0000 UTC m=+116.485307313 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:58:14 crc kubenswrapper[4956]: E0314 08:58:14.972692 4956 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:58:14 crc kubenswrapper[4956]: E0314 08:58:14.972775 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:30.972752689 +0000 UTC m=+116.485444997 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.989600 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.989667 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.989692 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.989723 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
14 08:58:14 crc kubenswrapper[4956]: I0314 08:58:14.989747 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:14Z","lastTransitionTime":"2026-03-14T08:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.073630 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.073728 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:15 crc kubenswrapper[4956]: E0314 08:58:15.073883 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:58:15 crc kubenswrapper[4956]: E0314 08:58:15.073940 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:58:15 crc kubenswrapper[4956]: E0314 08:58:15.073962 4956 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:15 crc kubenswrapper[4956]: E0314 08:58:15.074008 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:58:15 crc kubenswrapper[4956]: E0314 08:58:15.074044 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:58:15 crc kubenswrapper[4956]: E0314 08:58:15.074051 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:31.074026022 +0000 UTC m=+116.586718330 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:15 crc kubenswrapper[4956]: E0314 08:58:15.074070 4956 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:15 crc kubenswrapper[4956]: E0314 08:58:15.074157 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:31.074128854 +0000 UTC m=+116.586821202 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.092655 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.092712 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.092732 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.092755 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.092773 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:15Z","lastTransitionTime":"2026-03-14T08:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.195094 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.195145 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.195161 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.195183 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.195199 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:15Z","lastTransitionTime":"2026-03-14T08:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.208783 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.208824 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:15 crc kubenswrapper[4956]: E0314 08:58:15.208959 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.209005 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:15 crc kubenswrapper[4956]: E0314 08:58:15.209193 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:15 crc kubenswrapper[4956]: E0314 08:58:15.209403 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.229428 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.249132 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.267864 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.287645 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.297733 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.297790 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.297807 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.297831 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.297850 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:15Z","lastTransitionTime":"2026-03-14T08:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.308149 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.331205 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.352530 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, 
/tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.399884 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.399926 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.399937 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.399954 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.399967 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:15Z","lastTransitionTime":"2026-03-14T08:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.502440 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.502534 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.502553 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.502579 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.502597 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:15Z","lastTransitionTime":"2026-03-14T08:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.605394 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.605567 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.605587 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.605614 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.605637 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:15Z","lastTransitionTime":"2026-03-14T08:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.708713 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.708779 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.708802 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.708830 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.708854 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:15Z","lastTransitionTime":"2026-03-14T08:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.811366 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.811435 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.811452 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.811471 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.811518 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:15Z","lastTransitionTime":"2026-03-14T08:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.914411 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.914478 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.914525 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.914550 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:15 crc kubenswrapper[4956]: I0314 08:58:15.914570 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:15Z","lastTransitionTime":"2026-03-14T08:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.017092 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.017174 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.017198 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.017234 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.017253 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:16Z","lastTransitionTime":"2026-03-14T08:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.120228 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.120292 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.120309 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.120333 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.120352 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:16Z","lastTransitionTime":"2026-03-14T08:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.223137 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.223552 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.223770 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.223949 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.224127 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:16Z","lastTransitionTime":"2026-03-14T08:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.327047 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.327098 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.327115 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.327138 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.327155 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:16Z","lastTransitionTime":"2026-03-14T08:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.429735 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.429839 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.429861 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.429887 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.429904 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:16Z","lastTransitionTime":"2026-03-14T08:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.532738 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.532810 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.532834 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.532862 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.532885 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:16Z","lastTransitionTime":"2026-03-14T08:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.636374 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.636453 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.636477 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.636552 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.636579 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:16Z","lastTransitionTime":"2026-03-14T08:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.740120 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.740188 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.740207 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.740230 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.740246 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:16Z","lastTransitionTime":"2026-03-14T08:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.842700 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.842760 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.842778 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.842803 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.842821 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:16Z","lastTransitionTime":"2026-03-14T08:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.944880 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.944971 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.944996 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.945032 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:16 crc kubenswrapper[4956]: I0314 08:58:16.945056 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:16Z","lastTransitionTime":"2026-03-14T08:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.048591 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.048677 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.048704 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.048735 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.048758 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:17Z","lastTransitionTime":"2026-03-14T08:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.151099 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.151129 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.151138 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.151151 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.151159 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:17Z","lastTransitionTime":"2026-03-14T08:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.209436 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:17 crc kubenswrapper[4956]: E0314 08:58:17.209598 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.209687 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.209760 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:17 crc kubenswrapper[4956]: E0314 08:58:17.209815 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:17 crc kubenswrapper[4956]: E0314 08:58:17.209830 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.253923 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.253989 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.254015 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.254043 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.254064 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:17Z","lastTransitionTime":"2026-03-14T08:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.359126 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.359165 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.359173 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.359186 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.359196 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:17Z","lastTransitionTime":"2026-03-14T08:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.462701 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.462779 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.462803 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.462832 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.462855 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:17Z","lastTransitionTime":"2026-03-14T08:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.564854 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.564926 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.564950 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.564979 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.564999 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:17Z","lastTransitionTime":"2026-03-14T08:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.667872 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.667945 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.668013 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.668043 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.668071 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:17Z","lastTransitionTime":"2026-03-14T08:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.771242 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.771307 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.771325 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.771353 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.771372 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:17Z","lastTransitionTime":"2026-03-14T08:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.873942 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.874005 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.874022 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.874045 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.874062 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:17Z","lastTransitionTime":"2026-03-14T08:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.976540 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.976608 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.976620 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.976632 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:17 crc kubenswrapper[4956]: I0314 08:58:17.976643 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:17Z","lastTransitionTime":"2026-03-14T08:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.079060 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.079135 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.079159 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.079189 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.079213 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:18Z","lastTransitionTime":"2026-03-14T08:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.182859 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.182896 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.182904 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.182917 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.182926 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:18Z","lastTransitionTime":"2026-03-14T08:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.286827 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.286886 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.286903 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.286927 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.286943 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:18Z","lastTransitionTime":"2026-03-14T08:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.390117 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.390176 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.390193 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.390217 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.390234 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:18Z","lastTransitionTime":"2026-03-14T08:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.492600 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.492687 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.492705 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.492728 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.492745 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:18Z","lastTransitionTime":"2026-03-14T08:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.595079 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.595127 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.595142 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.595160 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.595175 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:18Z","lastTransitionTime":"2026-03-14T08:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.698212 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.698278 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.698303 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.698338 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.698361 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:18Z","lastTransitionTime":"2026-03-14T08:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.800812 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.800868 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.800885 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.800908 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.800927 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:18Z","lastTransitionTime":"2026-03-14T08:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.904211 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.904279 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.904295 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.904316 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:18 crc kubenswrapper[4956]: I0314 08:58:18.904333 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:18Z","lastTransitionTime":"2026-03-14T08:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.007685 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.008013 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.008260 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.008476 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.008707 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:19Z","lastTransitionTime":"2026-03-14T08:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.111909 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.111979 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.112004 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.112033 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.112059 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:19Z","lastTransitionTime":"2026-03-14T08:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.209177 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.209224 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:19 crc kubenswrapper[4956]: E0314 08:58:19.209303 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.209395 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:19 crc kubenswrapper[4956]: E0314 08:58:19.209554 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:19 crc kubenswrapper[4956]: E0314 08:58:19.209617 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.214186 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.214242 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.214265 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.214290 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.214309 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:19Z","lastTransitionTime":"2026-03-14T08:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.317120 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.317180 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.317195 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.317212 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.317224 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:19Z","lastTransitionTime":"2026-03-14T08:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.420358 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.420433 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.420454 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.420524 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.420556 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:19Z","lastTransitionTime":"2026-03-14T08:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.523788 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.523829 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.523841 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.523859 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.523868 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:19Z","lastTransitionTime":"2026-03-14T08:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.627359 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.627438 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.627463 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.627529 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.627551 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:19Z","lastTransitionTime":"2026-03-14T08:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.731812 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.731907 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.731929 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.731959 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.731991 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:19Z","lastTransitionTime":"2026-03-14T08:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.835259 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.835335 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.835357 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.835381 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.835399 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:19Z","lastTransitionTime":"2026-03-14T08:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.939084 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.939152 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.939169 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.939195 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:19 crc kubenswrapper[4956]: I0314 08:58:19.939214 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:19Z","lastTransitionTime":"2026-03-14T08:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.041794 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.041859 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.041878 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.041900 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.041922 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:20Z","lastTransitionTime":"2026-03-14T08:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.145459 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.145570 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.145593 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.145627 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.145650 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:20Z","lastTransitionTime":"2026-03-14T08:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.236081 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.249848 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.250361 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.250564 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.250758 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.250934 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:20Z","lastTransitionTime":"2026-03-14T08:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.354210 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.354755 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.354919 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.355083 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.355253 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:20Z","lastTransitionTime":"2026-03-14T08:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.459234 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.459693 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.459943 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.460157 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.460314 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:20Z","lastTransitionTime":"2026-03-14T08:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.564610 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.565157 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.565309 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.565445 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.565621 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:20Z","lastTransitionTime":"2026-03-14T08:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.669185 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.669234 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.669248 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.669266 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.669279 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:20Z","lastTransitionTime":"2026-03-14T08:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.772800 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.772861 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.772873 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.772894 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.772909 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:20Z","lastTransitionTime":"2026-03-14T08:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.876576 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.876653 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.876675 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.876704 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.876724 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:20Z","lastTransitionTime":"2026-03-14T08:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.980059 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.980164 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.980183 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.980213 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:20 crc kubenswrapper[4956]: I0314 08:58:20.980231 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:20Z","lastTransitionTime":"2026-03-14T08:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.084118 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.084195 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.084215 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.084244 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.084263 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:21Z","lastTransitionTime":"2026-03-14T08:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.188354 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.188423 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.188440 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.188471 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.188552 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:21Z","lastTransitionTime":"2026-03-14T08:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.209461 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.209616 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.209665 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:21 crc kubenswrapper[4956]: E0314 08:58:21.209796 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:21 crc kubenswrapper[4956]: E0314 08:58:21.210068 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:21 crc kubenswrapper[4956]: E0314 08:58:21.210282 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.291296 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.291344 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.291353 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.291367 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.291377 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:21Z","lastTransitionTime":"2026-03-14T08:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.395287 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.395350 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.395368 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.395394 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.395413 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:21Z","lastTransitionTime":"2026-03-14T08:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.499116 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.499190 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.499202 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.499226 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.499238 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:21Z","lastTransitionTime":"2026-03-14T08:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.602857 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.602934 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.602953 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.602976 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.602995 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:21Z","lastTransitionTime":"2026-03-14T08:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.706545 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.706624 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.706649 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.706687 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.706711 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:21Z","lastTransitionTime":"2026-03-14T08:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.810443 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.810559 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.810582 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.810613 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.810636 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:21Z","lastTransitionTime":"2026-03-14T08:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.914537 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.914595 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.914612 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.914638 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:21 crc kubenswrapper[4956]: I0314 08:58:21.914660 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:21Z","lastTransitionTime":"2026-03-14T08:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.018270 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.018319 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.018327 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.018344 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.018354 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:22Z","lastTransitionTime":"2026-03-14T08:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.121429 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.121525 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.121545 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.121572 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.121591 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:22Z","lastTransitionTime":"2026-03-14T08:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.225615 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.225691 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.225712 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.225744 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.225768 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:22Z","lastTransitionTime":"2026-03-14T08:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.328820 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.328874 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.328889 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.328909 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.328925 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:22Z","lastTransitionTime":"2026-03-14T08:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.431901 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.431961 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.431972 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.431995 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.432010 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:22Z","lastTransitionTime":"2026-03-14T08:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.535033 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.535106 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.535131 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.535168 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.535195 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:22Z","lastTransitionTime":"2026-03-14T08:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.638963 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.639035 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.639053 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.639086 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.639107 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:22Z","lastTransitionTime":"2026-03-14T08:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.742438 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.742546 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.742566 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.742597 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.742619 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:22Z","lastTransitionTime":"2026-03-14T08:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.846225 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.846300 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.846312 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.846335 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.846349 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:22Z","lastTransitionTime":"2026-03-14T08:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.950279 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.950370 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.950388 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.950419 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:22 crc kubenswrapper[4956]: I0314 08:58:22.950440 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:22Z","lastTransitionTime":"2026-03-14T08:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.052720 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.052771 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.052782 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.052800 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.052811 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:23Z","lastTransitionTime":"2026-03-14T08:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.155789 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.155828 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.155839 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.155859 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.155873 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:23Z","lastTransitionTime":"2026-03-14T08:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.209359 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:23 crc kubenswrapper[4956]: E0314 08:58:23.209552 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.209608 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.209671 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:23 crc kubenswrapper[4956]: E0314 08:58:23.209826 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:23 crc kubenswrapper[4956]: E0314 08:58:23.209971 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.210786 4956 scope.go:117] "RemoveContainer" containerID="f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.259038 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.259099 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.259113 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.259141 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.259159 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:23Z","lastTransitionTime":"2026-03-14T08:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.361462 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.361509 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.361518 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.361531 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.361540 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:23Z","lastTransitionTime":"2026-03-14T08:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.464230 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.464315 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.464325 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.464366 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.464376 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:23Z","lastTransitionTime":"2026-03-14T08:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.566833 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.566863 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.566870 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.566886 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.566896 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:23Z","lastTransitionTime":"2026-03-14T08:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.670042 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.670088 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.670104 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.670126 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.670143 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:23Z","lastTransitionTime":"2026-03-14T08:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.736958 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.736989 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.736998 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.737012 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.737022 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:23Z","lastTransitionTime":"2026-03-14T08:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:23 crc kubenswrapper[4956]: E0314 08:58:23.747634 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:23Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.750757 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.750779 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.750787 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.750800 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.750808 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:23Z","lastTransitionTime":"2026-03-14T08:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:23 crc kubenswrapper[4956]: E0314 08:58:23.764136 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:23Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.766955 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.766982 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.766989 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.767001 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.767010 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:23Z","lastTransitionTime":"2026-03-14T08:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:23 crc kubenswrapper[4956]: E0314 08:58:23.778545 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:23Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.782749 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.782791 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.782803 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.782819 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.782832 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:23Z","lastTransitionTime":"2026-03-14T08:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:23 crc kubenswrapper[4956]: E0314 08:58:23.803404 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:23Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.807727 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.807784 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.807801 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.807826 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.807844 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:23Z","lastTransitionTime":"2026-03-14T08:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:23 crc kubenswrapper[4956]: E0314 08:58:23.829525 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:23Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:23 crc kubenswrapper[4956]: E0314 08:58:23.829706 4956 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.831541 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.831597 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.831614 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.831635 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.831652 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:23Z","lastTransitionTime":"2026-03-14T08:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.935694 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.935757 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.935774 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.935800 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.935819 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:23Z","lastTransitionTime":"2026-03-14T08:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.985666 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.987921 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9"} Mar 14 08:58:23 crc kubenswrapper[4956]: I0314 08:58:23.988359 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.004956 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:24Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.019945 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:24Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.037836 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:24Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.039023 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.039087 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.039125 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.039171 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.039196 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:24Z","lastTransitionTime":"2026-03-14T08:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.071323 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:24Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.097391 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:24Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.121845 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:24Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.140406 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:24Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.143080 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.143118 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.143128 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.143146 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.143159 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:24Z","lastTransitionTime":"2026-03-14T08:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.159290 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:24Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.246137 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.246396 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.246508 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.246602 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.246689 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:24Z","lastTransitionTime":"2026-03-14T08:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.349563 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.349603 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.349614 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.349629 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.349642 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:24Z","lastTransitionTime":"2026-03-14T08:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.452829 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.452929 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.452956 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.452983 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.453002 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:24Z","lastTransitionTime":"2026-03-14T08:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.556194 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.556248 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.556264 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.556288 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.556305 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:24Z","lastTransitionTime":"2026-03-14T08:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.658727 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.658775 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.658785 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.658801 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.658812 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:24Z","lastTransitionTime":"2026-03-14T08:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.714099 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-h264v"] Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.714865 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-h264v" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.718003 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.718054 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.718664 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.739172 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:24Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.761917 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.761966 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.761981 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.762000 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.762013 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:24Z","lastTransitionTime":"2026-03-14T08:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.763008 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:24Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.764449 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d7e96975-9562-47ec-9476-e593f8d6be98-hosts-file\") pod \"node-resolver-h264v\" (UID: 
\"d7e96975-9562-47ec-9476-e593f8d6be98\") " pod="openshift-dns/node-resolver-h264v" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.764543 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mblhq\" (UniqueName: \"kubernetes.io/projected/d7e96975-9562-47ec-9476-e593f8d6be98-kube-api-access-mblhq\") pod \"node-resolver-h264v\" (UID: \"d7e96975-9562-47ec-9476-e593f8d6be98\") " pod="openshift-dns/node-resolver-h264v" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.778204 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:24Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.803082 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:24Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.853452 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:24Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.864826 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d7e96975-9562-47ec-9476-e593f8d6be98-hosts-file\") pod \"node-resolver-h264v\" (UID: \"d7e96975-9562-47ec-9476-e593f8d6be98\") " pod="openshift-dns/node-resolver-h264v" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.864871 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mblhq\" (UniqueName: \"kubernetes.io/projected/d7e96975-9562-47ec-9476-e593f8d6be98-kube-api-access-mblhq\") pod \"node-resolver-h264v\" (UID: \"d7e96975-9562-47ec-9476-e593f8d6be98\") " pod="openshift-dns/node-resolver-h264v" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.864896 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.864916 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.864928 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.864967 4956 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.865005 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:24Z","lastTransitionTime":"2026-03-14T08:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.865028 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d7e96975-9562-47ec-9476-e593f8d6be98-hosts-file\") pod \"node-resolver-h264v\" (UID: \"d7e96975-9562-47ec-9476-e593f8d6be98\") " pod="openshift-dns/node-resolver-h264v" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.867023 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:24Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.881380 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:24Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.884170 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mblhq\" (UniqueName: \"kubernetes.io/projected/d7e96975-9562-47ec-9476-e593f8d6be98-kube-api-access-mblhq\") pod \"node-resolver-h264v\" (UID: \"d7e96975-9562-47ec-9476-e593f8d6be98\") " pod="openshift-dns/node-resolver-h264v" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.894102 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:24Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.904259 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:24Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.968604 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.968648 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.968657 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.968675 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:24 crc kubenswrapper[4956]: I0314 08:58:24.968685 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:24Z","lastTransitionTime":"2026-03-14T08:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.034774 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-h264v" Mar 14 08:58:25 crc kubenswrapper[4956]: W0314 08:58:25.049368 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7e96975_9562_47ec_9476_e593f8d6be98.slice/crio-c894e61b6880fa94fe640631c303901d32e018584fd71b6f273f8b081ad32897 WatchSource:0}: Error finding container c894e61b6880fa94fe640631c303901d32e018584fd71b6f273f8b081ad32897: Status 404 returned error can't find the container with id c894e61b6880fa94fe640631c303901d32e018584fd71b6f273f8b081ad32897 Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.071856 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.072162 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.072177 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.072200 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.072213 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:25Z","lastTransitionTime":"2026-03-14T08:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.098536 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-mxjrk"] Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.098911 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-5xlw2"] Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.099439 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-sgnxb"] Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.099460 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.099710 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.099722 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.103351 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.103780 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.104264 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.104864 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.105292 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.105706 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.106762 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.106866 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.106933 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.107042 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.106879 4956 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.107200 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.124768 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/r
un/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.152770 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.166612 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-hostroot\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.166660 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7528e098-09d4-436f-a32d-a0e82e76b8e0-multus-daemon-config\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.166691 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-multus-socket-dir-parent\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.166717 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/b2131e0d-5d1b-4913-8908-e15859b063a4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.166744 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ba20367-e506-422e-a846-eb1525cb3b94-proxy-tls\") pod \"machine-config-daemon-mxjrk\" (UID: \"9ba20367-e506-422e-a846-eb1525cb3b94\") " pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.166767 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ba20367-e506-422e-a846-eb1525cb3b94-mcd-auth-proxy-config\") pod \"machine-config-daemon-mxjrk\" (UID: \"9ba20367-e506-422e-a846-eb1525cb3b94\") " pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.166836 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-var-lib-cni-multus\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.166860 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-run-multus-certs\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.166923 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9ba20367-e506-422e-a846-eb1525cb3b94-rootfs\") pod \"machine-config-daemon-mxjrk\" (UID: \"9ba20367-e506-422e-a846-eb1525cb3b94\") " pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.166952 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82zwx\" (UniqueName: \"kubernetes.io/projected/7528e098-09d4-436f-a32d-a0e82e76b8e0-kube-api-access-82zwx\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167011 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kmg4\" (UniqueName: \"kubernetes.io/projected/b2131e0d-5d1b-4913-8908-e15859b063a4-kube-api-access-6kmg4\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167035 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-system-cni-dir\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167055 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-var-lib-cni-bin\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 
08:58:25.167126 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-multus-cni-dir\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167150 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-cnibin\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167183 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-run-netns\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167205 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-etc-kubernetes\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167231 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-run-k8s-cni-cncf-io\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167276 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2131e0d-5d1b-4913-8908-e15859b063a4-system-cni-dir\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167304 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b2131e0d-5d1b-4913-8908-e15859b063a4-os-release\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167328 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b2131e0d-5d1b-4913-8908-e15859b063a4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167350 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7528e098-09d4-436f-a32d-a0e82e76b8e0-cni-binary-copy\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167401 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b2131e0d-5d1b-4913-8908-e15859b063a4-cni-binary-copy\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " 
pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167428 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cczk8\" (UniqueName: \"kubernetes.io/projected/9ba20367-e506-422e-a846-eb1525cb3b94-kube-api-access-cczk8\") pod \"machine-config-daemon-mxjrk\" (UID: \"9ba20367-e506-422e-a846-eb1525cb3b94\") " pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167500 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-multus-conf-dir\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167538 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-var-lib-kubelet\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167574 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b2131e0d-5d1b-4913-8908-e15859b063a4-cnibin\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167559 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.167597 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-os-release\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.174042 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.174070 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.174078 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.174092 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.174104 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:25Z","lastTransitionTime":"2026-03-14T08:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.182117 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.201095 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.208663 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.208717 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.208672 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:25 crc kubenswrapper[4956]: E0314 08:58:25.208763 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:25 crc kubenswrapper[4956]: E0314 08:58:25.208858 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:25 crc kubenswrapper[4956]: E0314 08:58:25.208962 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.219231 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.230656 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.243350 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.259005 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268577 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82zwx\" (UniqueName: \"kubernetes.io/projected/7528e098-09d4-436f-a32d-a0e82e76b8e0-kube-api-access-82zwx\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268635 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9ba20367-e506-422e-a846-eb1525cb3b94-rootfs\") pod \"machine-config-daemon-mxjrk\" (UID: \"9ba20367-e506-422e-a846-eb1525cb3b94\") " pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268662 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kmg4\" (UniqueName: \"kubernetes.io/projected/b2131e0d-5d1b-4913-8908-e15859b063a4-kube-api-access-6kmg4\") pod 
\"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268684 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-system-cni-dir\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268704 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-var-lib-cni-bin\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268726 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-run-netns\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268751 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-etc-kubernetes\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268780 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-multus-cni-dir\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 
crc kubenswrapper[4956]: I0314 08:58:25.268802 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-cnibin\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268822 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2131e0d-5d1b-4913-8908-e15859b063a4-system-cni-dir\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268837 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-system-cni-dir\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268843 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b2131e0d-5d1b-4913-8908-e15859b063a4-os-release\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268879 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b2131e0d-5d1b-4913-8908-e15859b063a4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268895 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7528e098-09d4-436f-a32d-a0e82e76b8e0-cni-binary-copy\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268898 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b2131e0d-5d1b-4913-8908-e15859b063a4-os-release\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268910 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-run-k8s-cni-cncf-io\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268934 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b2131e0d-5d1b-4913-8908-e15859b063a4-cni-binary-copy\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268951 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cczk8\" (UniqueName: \"kubernetes.io/projected/9ba20367-e506-422e-a846-eb1525cb3b94-kube-api-access-cczk8\") pod \"machine-config-daemon-mxjrk\" (UID: \"9ba20367-e506-422e-a846-eb1525cb3b94\") " pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268963 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-etc-kubernetes\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268976 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-multus-conf-dir\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.269043 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-cnibin\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.269054 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-var-lib-kubelet\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.269082 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-var-lib-kubelet\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.269541 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-multus-cni-dir\") pod 
\"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.269165 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9ba20367-e506-422e-a846-eb1525cb3b94-rootfs\") pod \"machine-config-daemon-mxjrk\" (UID: \"9ba20367-e506-422e-a846-eb1525cb3b94\") " pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.269178 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-run-netns\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.268948 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-var-lib-cni-bin\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.269236 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-run-k8s-cni-cncf-io\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.269121 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2131e0d-5d1b-4913-8908-e15859b063a4-system-cni-dir\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 
08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.269754 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b2131e0d-5d1b-4913-8908-e15859b063a4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.270022 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-multus-conf-dir\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.270102 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b2131e0d-5d1b-4913-8908-e15859b063a4-cni-binary-copy\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.270109 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-os-release\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.270309 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7528e098-09d4-436f-a32d-a0e82e76b8e0-cni-binary-copy\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.270512 4956 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-os-release\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.270577 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b2131e0d-5d1b-4913-8908-e15859b063a4-cnibin\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.270624 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-hostroot\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.270647 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7528e098-09d4-436f-a32d-a0e82e76b8e0-multus-daemon-config\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.270668 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-multus-socket-dir-parent\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.270725 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-hostroot\") pod \"multus-sgnxb\" (UID: 
\"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.270742 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b2131e0d-5d1b-4913-8908-e15859b063a4-cnibin\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.270884 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-multus-socket-dir-parent\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.270926 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ba20367-e506-422e-a846-eb1525cb3b94-proxy-tls\") pod \"machine-config-daemon-mxjrk\" (UID: \"9ba20367-e506-422e-a846-eb1525cb3b94\") " pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.270981 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ba20367-e506-422e-a846-eb1525cb3b94-mcd-auth-proxy-config\") pod \"machine-config-daemon-mxjrk\" (UID: \"9ba20367-e506-422e-a846-eb1525cb3b94\") " pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.271006 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-var-lib-cni-multus\") pod \"multus-sgnxb\" (UID: 
\"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.271036 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b2131e0d-5d1b-4913-8908-e15859b063a4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.271063 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-run-multus-certs\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.271149 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-run-multus-certs\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.271320 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7528e098-09d4-436f-a32d-a0e82e76b8e0-multus-daemon-config\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.271385 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7528e098-09d4-436f-a32d-a0e82e76b8e0-host-var-lib-cni-multus\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 
08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.271916 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ba20367-e506-422e-a846-eb1525cb3b94-mcd-auth-proxy-config\") pod \"machine-config-daemon-mxjrk\" (UID: \"9ba20367-e506-422e-a846-eb1525cb3b94\") " pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.271929 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b2131e0d-5d1b-4913-8908-e15859b063a4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.274823 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.274931 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ba20367-e506-422e-a846-eb1525cb3b94-proxy-tls\") pod \"machine-config-daemon-mxjrk\" (UID: \"9ba20367-e506-422e-a846-eb1525cb3b94\") " pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.276221 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.276251 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.276261 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.276277 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.276298 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:25Z","lastTransitionTime":"2026-03-14T08:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.288225 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82zwx\" (UniqueName: \"kubernetes.io/projected/7528e098-09d4-436f-a32d-a0e82e76b8e0-kube-api-access-82zwx\") pod \"multus-sgnxb\" (UID: \"7528e098-09d4-436f-a32d-a0e82e76b8e0\") " pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.288432 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cczk8\" (UniqueName: \"kubernetes.io/projected/9ba20367-e506-422e-a846-eb1525cb3b94-kube-api-access-cczk8\") pod \"machine-config-daemon-mxjrk\" (UID: \"9ba20367-e506-422e-a846-eb1525cb3b94\") " pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.290511 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kmg4\" (UniqueName: \"kubernetes.io/projected/b2131e0d-5d1b-4913-8908-e15859b063a4-kube-api-access-6kmg4\") pod \"multus-additional-cni-plugins-5xlw2\" (UID: \"b2131e0d-5d1b-4913-8908-e15859b063a4\") " pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.292416 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.308604 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.322573 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.339055 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.349560 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.359168 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.370123 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.378454 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.378502 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.378514 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.378531 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.378543 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:25Z","lastTransitionTime":"2026-03-14T08:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.380724 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.389255 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.399520 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.411607 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.422778 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.423825 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.438675 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.439207 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sgnxb" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.451024 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" Mar 14 08:58:25 crc kubenswrapper[4956]: W0314 08:58:25.453946 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7528e098_09d4_436f_a32d_a0e82e76b8e0.slice/crio-d3d67cf86f270f758f589b7eff4eb1fc6e7dea5ee4cce4f5ff949c733977677e WatchSource:0}: Error finding container d3d67cf86f270f758f589b7eff4eb1fc6e7dea5ee4cce4f5ff949c733977677e: Status 404 returned error can't find the container with id d3d67cf86f270f758f589b7eff4eb1fc6e7dea5ee4cce4f5ff949c733977677e Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.455013 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.467316 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.476226 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.480988 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.481051 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.481067 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.481092 4956 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.481105 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:25Z","lastTransitionTime":"2026-03-14T08:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.484363 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qj4pg"] Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.485966 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: W0314 08:58:25.486540 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2131e0d_5d1b_4913_8908_e15859b063a4.slice/crio-3f8b8f978052204a48bb37cbb6efc60d00948ef8551c665f098c627bd787e57e WatchSource:0}: Error finding container 3f8b8f978052204a48bb37cbb6efc60d00948ef8551c665f098c627bd787e57e: Status 404 returned error can't find the container with id 3f8b8f978052204a48bb37cbb6efc60d00948ef8551c665f098c627bd787e57e Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.491306 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.491462 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.491788 4956 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.492154 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.492367 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.492472 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.492534 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.493747 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.506043 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.524721 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.542134 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.555145 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.571453 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.573663 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-openvswitch\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.573694 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-etc-openvswitch\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.573962 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-slash\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.573995 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-cni-bin\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.574020 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-ovnkube-config\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.574045 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkxwz\" (UniqueName: \"kubernetes.io/projected/57d4b4cb-2115-421e-8f2a-491ec851328c-kube-api-access-wkxwz\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.574129 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-systemd\") pod \"ovnkube-node-qj4pg\" (UID: 
\"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.574186 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-env-overrides\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.574225 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-ovnkube-script-lib\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.574304 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-cni-netd\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.574328 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-kubelet\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.574349 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-run-netns\") pod 
\"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.574406 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.574458 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-run-ovn-kubernetes\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.574586 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57d4b4cb-2115-421e-8f2a-491ec851328c-ovn-node-metrics-cert\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.574641 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-node-log\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.574678 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-ovn\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.574714 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-var-lib-openvswitch\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.574733 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-log-socket\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.574756 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-systemd-units\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.583731 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.583760 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.583768 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.583781 4956 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.583791 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:25Z","lastTransitionTime":"2026-03-14T08:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.590636 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.603730 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.617393 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.630590 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.650966 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.668564 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.675739 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-run-ovn-kubernetes\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.675790 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57d4b4cb-2115-421e-8f2a-491ec851328c-ovn-node-metrics-cert\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc 
kubenswrapper[4956]: I0314 08:58:25.675813 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-ovn\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.675844 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-node-log\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.675855 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-run-ovn-kubernetes\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.675927 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-node-log\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.675930 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-var-lib-openvswitch\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.675877 4956 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-var-lib-openvswitch\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.675984 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-systemd-units\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676002 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-log-socket\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.675987 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-ovn\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676018 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-etc-openvswitch\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676038 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-openvswitch\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676057 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkxwz\" (UniqueName: \"kubernetes.io/projected/57d4b4cb-2115-421e-8f2a-491ec851328c-kube-api-access-wkxwz\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676063 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-log-socket\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676073 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-slash\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676091 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-cni-bin\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676094 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-systemd-units\") pod 
\"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676106 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-ovnkube-config\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676124 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-env-overrides\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676141 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-systemd\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676157 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-ovnkube-script-lib\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676190 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-cni-netd\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676206 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-kubelet\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676221 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-run-netns\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676244 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676284 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676306 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-cni-bin\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676318 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-slash\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676357 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-kubelet\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676394 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-etc-openvswitch\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676396 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-cni-netd\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676401 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-systemd\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676424 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-run-netns\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676441 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-openvswitch\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.676966 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-ovnkube-script-lib\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.677003 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-env-overrides\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.677073 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-ovnkube-config\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.680841 4956 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.680894 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57d4b4cb-2115-421e-8f2a-491ec851328c-ovn-node-metrics-cert\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.686114 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.686220 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.686233 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:25 crc 
kubenswrapper[4956]: I0314 08:58:25.686260 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.686274 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:25Z","lastTransitionTime":"2026-03-14T08:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.695696 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.699839 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkxwz\" (UniqueName: \"kubernetes.io/projected/57d4b4cb-2115-421e-8f2a-491ec851328c-kube-api-access-wkxwz\") pod \"ovnkube-node-qj4pg\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.712272 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.726381 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.745141 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.763954 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.781586 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.788855 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.788911 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.788924 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.788947 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.788962 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:25Z","lastTransitionTime":"2026-03-14T08:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.798025 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.819223 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.827100 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.845807 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.892559 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.892620 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.892637 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:25 crc 
kubenswrapper[4956]: I0314 08:58:25.892661 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.892678 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:25Z","lastTransitionTime":"2026-03-14T08:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.993943 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.993989 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.994004 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.994028 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.994047 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:25Z","lastTransitionTime":"2026-03-14T08:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.996448 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerStarted","Data":"706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41"} Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.996547 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerStarted","Data":"b126c5e8dac0183ac3f1b4d3f310791b0219224ca6f75c23ed7ffbf39bd12cee"} Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.996565 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerStarted","Data":"c227e61e512ff5ae3aaa09b803bc00970af23808d0e4bd9953f56d73af4b1775"} Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.998254 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h264v" event={"ID":"d7e96975-9562-47ec-9476-e593f8d6be98","Type":"ContainerStarted","Data":"465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331"} Mar 14 08:58:25 crc kubenswrapper[4956]: I0314 08:58:25.998301 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h264v" event={"ID":"d7e96975-9562-47ec-9476-e593f8d6be98","Type":"ContainerStarted","Data":"c894e61b6880fa94fe640631c303901d32e018584fd71b6f273f8b081ad32897"} Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.000931 4956 generic.go:334] "Generic (PLEG): container finished" podID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerID="37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b" exitCode=0 Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.001005 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerDied","Data":"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b"} Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.001196 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerStarted","Data":"ac6d0c7d195480bd7fed164793aa36cad36f6abe33425c6eb81abc79f7c91832"} Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.002839 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgnxb" event={"ID":"7528e098-09d4-436f-a32d-a0e82e76b8e0","Type":"ContainerStarted","Data":"cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767"} Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.002892 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgnxb" event={"ID":"7528e098-09d4-436f-a32d-a0e82e76b8e0","Type":"ContainerStarted","Data":"d3d67cf86f270f758f589b7eff4eb1fc6e7dea5ee4cce4f5ff949c733977677e"} Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.005785 4956 generic.go:334] "Generic (PLEG): container finished" podID="b2131e0d-5d1b-4913-8908-e15859b063a4" containerID="c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b" exitCode=0 Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.005820 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" event={"ID":"b2131e0d-5d1b-4913-8908-e15859b063a4","Type":"ContainerDied","Data":"c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b"} Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.005840 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" 
event={"ID":"b2131e0d-5d1b-4913-8908-e15859b063a4","Type":"ContainerStarted","Data":"3f8b8f978052204a48bb37cbb6efc60d00948ef8551c665f098c627bd787e57e"} Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.019449 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.040612 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.069448 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.089082 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.097400 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.097636 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.097782 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.097957 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.099051 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:26Z","lastTransitionTime":"2026-03-14T08:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.108361 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.135471 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.158779 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.180773 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.197758 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.201434 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.201460 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.201468 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.201495 4956 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.201505 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:26Z","lastTransitionTime":"2026-03-14T08:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.211659 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.226576 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.245638 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.259796 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.275102 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.288685 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.303653 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.304841 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.304881 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.304895 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.304914 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.304926 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:26Z","lastTransitionTime":"2026-03-14T08:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.318089 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224ca6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.332594 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.349728 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"st
ate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-bin
ary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.372096 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.387659 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.413143 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.413187 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.413197 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:26 crc 
kubenswrapper[4956]: I0314 08:58:26.413221 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.413232 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:26Z","lastTransitionTime":"2026-03-14T08:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.424431 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.471277 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.511889 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.515725 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 
08:58:26.515775 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.515793 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.515819 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.515837 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:26Z","lastTransitionTime":"2026-03-14T08:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.544596 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.591206 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.618418 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.618455 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.618465 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.618500 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.618513 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:26Z","lastTransitionTime":"2026-03-14T08:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.722624 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.723179 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.723194 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.723225 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.723242 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:26Z","lastTransitionTime":"2026-03-14T08:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.827563 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.827621 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.827635 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.827658 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.827675 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:26Z","lastTransitionTime":"2026-03-14T08:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.934412 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.934511 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.934532 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.934562 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:26 crc kubenswrapper[4956]: I0314 08:58:26.934588 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:26Z","lastTransitionTime":"2026-03-14T08:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.013658 4956 generic.go:334] "Generic (PLEG): container finished" podID="b2131e0d-5d1b-4913-8908-e15859b063a4" containerID="825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6" exitCode=0 Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.013733 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" event={"ID":"b2131e0d-5d1b-4913-8908-e15859b063a4","Type":"ContainerDied","Data":"825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6"} Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.018971 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerStarted","Data":"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c"} Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.019044 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerStarted","Data":"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4"} Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.019076 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerStarted","Data":"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f"} Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.019101 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerStarted","Data":"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b"} Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.030575 4956 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.036931 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.036994 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.037006 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.037028 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.037045 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:27Z","lastTransitionTime":"2026-03-14T08:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.046562 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.059163 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.072804 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.085136 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.100105 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"n
ame\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-03-14T08:58:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.115635 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.126619 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.139945 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.139978 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.139987 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.140000 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.140010 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:27Z","lastTransitionTime":"2026-03-14T08:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.146375 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.159303 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.170803 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.190129 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.204766 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.209275 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:27 crc kubenswrapper[4956]: E0314 08:58:27.209388 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.209432 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.209613 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:27 crc kubenswrapper[4956]: E0314 08:58:27.209802 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:27 crc kubenswrapper[4956]: E0314 08:58:27.209870 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.222852 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.242313 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.242348 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.242356 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.242369 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.242379 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:27Z","lastTransitionTime":"2026-03-14T08:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.344671 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.344700 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.344708 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.344722 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.344733 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:27Z","lastTransitionTime":"2026-03-14T08:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.446384 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.446423 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.446433 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.446448 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.446459 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:27Z","lastTransitionTime":"2026-03-14T08:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.548828 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.548897 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.548920 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.548952 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.548977 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:27Z","lastTransitionTime":"2026-03-14T08:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.651425 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.651504 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.651515 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.651528 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.651538 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:27Z","lastTransitionTime":"2026-03-14T08:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.755033 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.755112 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.755137 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.755167 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.755184 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:27Z","lastTransitionTime":"2026-03-14T08:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.858793 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.858850 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.858867 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.858891 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.858909 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:27Z","lastTransitionTime":"2026-03-14T08:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.962058 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.962105 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.962116 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.962134 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:27 crc kubenswrapper[4956]: I0314 08:58:27.962149 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:27Z","lastTransitionTime":"2026-03-14T08:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.029024 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerStarted","Data":"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f"} Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.029121 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerStarted","Data":"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2"} Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.032810 4956 generic.go:334] "Generic (PLEG): container finished" podID="b2131e0d-5d1b-4913-8908-e15859b063a4" containerID="66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1" exitCode=0 Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.032947 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" event={"ID":"b2131e0d-5d1b-4913-8908-e15859b063a4","Type":"ContainerDied","Data":"66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1"} Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.059218 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:28Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.066065 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:28 crc 
kubenswrapper[4956]: I0314 08:58:28.066132 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.066154 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.066202 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.066224 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:28Z","lastTransitionTime":"2026-03-14T08:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.078000 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:28Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.102075 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:28Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.119828 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:28Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.135706 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:28Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.150565 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:28Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.168266 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.168296 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.168306 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.168323 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.168337 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:28Z","lastTransitionTime":"2026-03-14T08:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.168515 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:28Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 
08:58:28.191318 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:28Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.202508 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:28Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.216703 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:28Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.232145 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:28Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.247262 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:28Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.257877 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:28Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.271251 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.271289 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.271299 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.271315 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.271327 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:28Z","lastTransitionTime":"2026-03-14T08:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.274424 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:28Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.374849 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.374911 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.374929 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.374990 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.375011 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:28Z","lastTransitionTime":"2026-03-14T08:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.477967 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.478008 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.478022 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.478043 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.478058 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:28Z","lastTransitionTime":"2026-03-14T08:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.582752 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.582812 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.582829 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.582860 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.582879 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:28Z","lastTransitionTime":"2026-03-14T08:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.685626 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.685704 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.685742 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.685769 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.685788 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:28Z","lastTransitionTime":"2026-03-14T08:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.788646 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.788715 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.788740 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.788772 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.788790 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:28Z","lastTransitionTime":"2026-03-14T08:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.892201 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.892293 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.892308 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.892332 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.892348 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:28Z","lastTransitionTime":"2026-03-14T08:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.995441 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.995535 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.995554 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.995578 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:28 crc kubenswrapper[4956]: I0314 08:58:28.995597 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:28Z","lastTransitionTime":"2026-03-14T08:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.041211 4956 generic.go:334] "Generic (PLEG): container finished" podID="b2131e0d-5d1b-4913-8908-e15859b063a4" containerID="f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818" exitCode=0 Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.041280 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" event={"ID":"b2131e0d-5d1b-4913-8908-e15859b063a4","Type":"ContainerDied","Data":"f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818"} Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.066909 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:29Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.088248 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:29Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.100045 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.100093 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.100104 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.100121 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.100133 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:29Z","lastTransitionTime":"2026-03-14T08:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.126270 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:29Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.150329 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:29Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.167521 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:29Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.189592 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:29Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.203451 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.203510 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.203519 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.203535 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.203547 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:29Z","lastTransitionTime":"2026-03-14T08:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.207059 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:29Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.209609 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.209710 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:29 crc kubenswrapper[4956]: E0314 08:58:29.210345 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:29 crc kubenswrapper[4956]: E0314 08:58:29.209924 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.209626 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:29 crc kubenswrapper[4956]: E0314 08:58:29.210709 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.226590 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\"
:\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:29Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.245463 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:29Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.265691 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:29Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.289103 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:29Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.306933 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.306969 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.306980 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:29 crc 
kubenswrapper[4956]: I0314 08:58:29.306998 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.307010 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:29Z","lastTransitionTime":"2026-03-14T08:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.318743 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:29Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.339315 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:29Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.355554 
4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:29Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.409025 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.409088 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.409105 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.409130 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.409152 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:29Z","lastTransitionTime":"2026-03-14T08:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.511885 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.511925 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.511936 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.511951 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.511963 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:29Z","lastTransitionTime":"2026-03-14T08:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.614325 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.614361 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.614369 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.614384 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.614393 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:29Z","lastTransitionTime":"2026-03-14T08:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.717741 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.717802 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.717820 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.717846 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.717866 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:29Z","lastTransitionTime":"2026-03-14T08:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.820570 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.820649 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.820678 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.820710 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.820736 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:29Z","lastTransitionTime":"2026-03-14T08:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.923796 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.923871 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.923911 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.923944 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:29 crc kubenswrapper[4956]: I0314 08:58:29.923967 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:29Z","lastTransitionTime":"2026-03-14T08:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.027278 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.027339 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.027362 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.027389 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.027411 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:30Z","lastTransitionTime":"2026-03-14T08:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.048382 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerStarted","Data":"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa"} Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.051740 4956 generic.go:334] "Generic (PLEG): container finished" podID="b2131e0d-5d1b-4913-8908-e15859b063a4" containerID="76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5" exitCode=0 Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.051801 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" event={"ID":"b2131e0d-5d1b-4913-8908-e15859b063a4","Type":"ContainerDied","Data":"76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5"} Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.087142 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:30Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.109610 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:30Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.130019 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.130078 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.130097 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:30 crc 
kubenswrapper[4956]: I0314 08:58:30.130123 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.130141 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:30Z","lastTransitionTime":"2026-03-14T08:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.133456 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:30Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.159321 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:30Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.178652 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:30Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.191474 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:30Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.211253 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:30Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.222596 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T0
8:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:30Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.232449 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.232547 4956 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.232558 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.232570 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.232579 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:30Z","lastTransitionTime":"2026-03-14T08:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.234325 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:30Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.250930 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:30Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.261864 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:30Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.273506 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:30Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.285665 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:30Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.302037 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:30Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.334134 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.334167 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.334178 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.334192 4956 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.334204 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:30Z","lastTransitionTime":"2026-03-14T08:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.436611 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.436651 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.436659 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.436673 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.436681 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:30Z","lastTransitionTime":"2026-03-14T08:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.540252 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.540319 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.540333 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.540357 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.540371 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:30Z","lastTransitionTime":"2026-03-14T08:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.643047 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.643112 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.643131 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.643157 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.643214 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:30Z","lastTransitionTime":"2026-03-14T08:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.749127 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.749236 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.749265 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.749298 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.749317 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:30Z","lastTransitionTime":"2026-03-14T08:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.853064 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.853116 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.853135 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.853157 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.853172 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:30Z","lastTransitionTime":"2026-03-14T08:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.957329 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.957395 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.957412 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.957436 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:30 crc kubenswrapper[4956]: I0314 08:58:30.957458 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:30Z","lastTransitionTime":"2026-03-14T08:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.035671 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.035849 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:31 crc kubenswrapper[4956]: E0314 08:58:31.035889 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:03.035860274 +0000 UTC m=+148.548552552 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.035944 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:31 crc kubenswrapper[4956]: E0314 08:58:31.035971 4956 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:58:31 crc kubenswrapper[4956]: E0314 08:58:31.036060 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:59:03.036038318 +0000 UTC m=+148.548730626 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:58:31 crc kubenswrapper[4956]: E0314 08:58:31.036067 4956 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:58:31 crc kubenswrapper[4956]: E0314 08:58:31.036104 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:59:03.03609531 +0000 UTC m=+148.548787588 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.060398 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.060461 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.060513 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.060596 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.060617 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:31Z","lastTransitionTime":"2026-03-14T08:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.061747 4956 generic.go:334] "Generic (PLEG): container finished" podID="b2131e0d-5d1b-4913-8908-e15859b063a4" containerID="4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0" exitCode=0 Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.061807 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" event={"ID":"b2131e0d-5d1b-4913-8908-e15859b063a4","Type":"ContainerDied","Data":"4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0"} Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.118056 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.138310 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:31 crc kubenswrapper[4956]: E0314 08:58:31.138604 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:58:31 crc kubenswrapper[4956]: E0314 08:58:31.138659 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:58:31 crc kubenswrapper[4956]: E0314 08:58:31.138679 4956 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:31 crc kubenswrapper[4956]: E0314 08:58:31.138754 4956 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 08:59:03.138730675 +0000 UTC m=+148.651422973 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.139182 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:31 crc kubenswrapper[4956]: E0314 08:58:31.139275 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:58:31 crc kubenswrapper[4956]: E0314 08:58:31.139306 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:58:31 crc kubenswrapper[4956]: E0314 08:58:31.139320 4956 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 
08:58:31 crc kubenswrapper[4956]: E0314 08:58:31.139376 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 08:59:03.13936039 +0000 UTC m=+148.652052678 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.143321 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.159337 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.165788 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.165822 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.165834 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.165849 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.165860 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:31Z","lastTransitionTime":"2026-03-14T08:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.176923 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.193073 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.204418 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.208558 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:31 crc kubenswrapper[4956]: E0314 08:58:31.208673 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.208741 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.208751 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:31 crc kubenswrapper[4956]: E0314 08:58:31.208883 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:31 crc kubenswrapper[4956]: E0314 08:58:31.208974 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.231831 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.242464 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2d
dc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224ca6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08
:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.253386 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.264674 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.267768 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.267821 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.267837 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.267857 
4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.267876 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:31Z","lastTransitionTime":"2026-03-14T08:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.281182 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.297945 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.314666 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.331352 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.369769 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.369813 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.369825 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.369844 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.369856 4956 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:31Z","lastTransitionTime":"2026-03-14T08:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.473367 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.473410 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.473421 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.473437 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.473450 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:31Z","lastTransitionTime":"2026-03-14T08:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.541026 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-gqfh2"] Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.541588 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-gqfh2" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.544375 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.544821 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.545171 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.545461 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.562256 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.577125 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.577184 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.577208 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:31 crc 
kubenswrapper[4956]: I0314 08:58:31.577240 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.577262 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:31Z","lastTransitionTime":"2026-03-14T08:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.581350 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.612934 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.629451 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.643083 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.643388 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7535708d-1eec-4d4c-b0eb-f5343af71b3d-serviceca\") pod \"node-ca-gqfh2\" (UID: \"7535708d-1eec-4d4c-b0eb-f5343af71b3d\") " pod="openshift-image-registry/node-ca-gqfh2" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.643467 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7535708d-1eec-4d4c-b0eb-f5343af71b3d-host\") pod \"node-ca-gqfh2\" (UID: \"7535708d-1eec-4d4c-b0eb-f5343af71b3d\") " pod="openshift-image-registry/node-ca-gqfh2" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.643573 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rs6d\" (UniqueName: \"kubernetes.io/projected/7535708d-1eec-4d4c-b0eb-f5343af71b3d-kube-api-access-9rs6d\") pod \"node-ca-gqfh2\" (UID: \"7535708d-1eec-4d4c-b0eb-f5343af71b3d\") " pod="openshift-image-registry/node-ca-gqfh2" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.668170 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.679973 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.680010 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.680023 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.680044 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.680057 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:31Z","lastTransitionTime":"2026-03-14T08:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.688356 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.702518 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.715611 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.727180 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.737014 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.744986 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rs6d\" (UniqueName: \"kubernetes.io/projected/7535708d-1eec-4d4c-b0eb-f5343af71b3d-kube-api-access-9rs6d\") pod \"node-ca-gqfh2\" (UID: \"7535708d-1eec-4d4c-b0eb-f5343af71b3d\") " pod="openshift-image-registry/node-ca-gqfh2" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.745085 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" 
(UniqueName: \"kubernetes.io/configmap/7535708d-1eec-4d4c-b0eb-f5343af71b3d-serviceca\") pod \"node-ca-gqfh2\" (UID: \"7535708d-1eec-4d4c-b0eb-f5343af71b3d\") " pod="openshift-image-registry/node-ca-gqfh2" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.745157 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7535708d-1eec-4d4c-b0eb-f5343af71b3d-host\") pod \"node-ca-gqfh2\" (UID: \"7535708d-1eec-4d4c-b0eb-f5343af71b3d\") " pod="openshift-image-registry/node-ca-gqfh2" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.745242 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7535708d-1eec-4d4c-b0eb-f5343af71b3d-host\") pod \"node-ca-gqfh2\" (UID: \"7535708d-1eec-4d4c-b0eb-f5343af71b3d\") " pod="openshift-image-registry/node-ca-gqfh2" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.746915 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7535708d-1eec-4d4c-b0eb-f5343af71b3d-serviceca\") pod \"node-ca-gqfh2\" (UID: \"7535708d-1eec-4d4c-b0eb-f5343af71b3d\") " pod="openshift-image-registry/node-ca-gqfh2" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.748805 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.761169 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rs6d\" (UniqueName: 
\"kubernetes.io/projected/7535708d-1eec-4d4c-b0eb-f5343af71b3d-kube-api-access-9rs6d\") pod \"node-ca-gqfh2\" (UID: \"7535708d-1eec-4d4c-b0eb-f5343af71b3d\") " pod="openshift-image-registry/node-ca-gqfh2" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.761445 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.777023 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.781805 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.781849 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.781866 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.781922 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.781940 4956 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:31Z","lastTransitionTime":"2026-03-14T08:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.789441 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.858162 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-gqfh2" Mar 14 08:58:31 crc kubenswrapper[4956]: W0314 08:58:31.868757 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7535708d_1eec_4d4c_b0eb_f5343af71b3d.slice/crio-270c00389549f6c28c322bffb5304bb27ce89e9da3751a2e2d94820fce8f33de WatchSource:0}: Error finding container 270c00389549f6c28c322bffb5304bb27ce89e9da3751a2e2d94820fce8f33de: Status 404 returned error can't find the container with id 270c00389549f6c28c322bffb5304bb27ce89e9da3751a2e2d94820fce8f33de Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.883846 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.883876 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.883884 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.883898 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.883907 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:31Z","lastTransitionTime":"2026-03-14T08:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.986452 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.986475 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.986512 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.986525 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:31 crc kubenswrapper[4956]: I0314 08:58:31.986534 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:31Z","lastTransitionTime":"2026-03-14T08:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.065068 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gqfh2" event={"ID":"7535708d-1eec-4d4c-b0eb-f5343af71b3d","Type":"ContainerStarted","Data":"270c00389549f6c28c322bffb5304bb27ce89e9da3751a2e2d94820fce8f33de"} Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.068340 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerStarted","Data":"fdc1f6769692baf11365bc50844ae83f545502529a4c6971a50552856faffa94"} Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.068605 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.068673 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.071618 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" event={"ID":"b2131e0d-5d1b-4913-8908-e15859b063a4","Type":"ContainerStarted","Data":"9460a21d093786b0d6c6875b9a0f638f24445d469c310b5df7bd9485de39e38f"} Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.088979 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.089007 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.089015 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.089031 4956 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.089039 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:32Z","lastTransitionTime":"2026-03-14T08:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.093019 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673147
31ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.104527 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.115092 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.116400 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.129644 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.139300 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.146806 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.169014 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc1f6769692baf11365bc50844ae83f545502529a4c6971a50552856faffa94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.181424 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.190525 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.190580 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.190590 4956 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.190605 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.190615 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:32Z","lastTransitionTime":"2026-03-14T08:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.192076 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf8
6d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.202460 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.213096 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.223510 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.234041 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.247428 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.257254 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.275332 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.292419 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.293146 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.293174 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.293181 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:32 crc 
kubenswrapper[4956]: I0314 08:58:32.293195 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.293204 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:32Z","lastTransitionTime":"2026-03-14T08:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.301454 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.317697 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc1f6769692baf11365bc50844ae83f545502529a4c6971a50552856faffa94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\
\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.328356 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.338280 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.345754 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.356032 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.364795 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.375899 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.385321 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.395586 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.395622 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.395635 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.395654 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.395667 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:32Z","lastTransitionTime":"2026-03-14T08:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.397581 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.405961 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.416714 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.428113 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9460a21d093786b0d6c6875b9a0f638f24445d469c310b5df7bd9485de39e38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735
597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.498010 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.498044 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.498053 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.498066 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.498076 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:32Z","lastTransitionTime":"2026-03-14T08:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.601560 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.601635 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.601658 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.601686 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.601708 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:32Z","lastTransitionTime":"2026-03-14T08:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.704146 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.704231 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.704249 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.704278 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.704298 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:32Z","lastTransitionTime":"2026-03-14T08:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.807625 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.807699 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.807723 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.807755 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.807777 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:32Z","lastTransitionTime":"2026-03-14T08:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.910437 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.910538 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.910562 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.910592 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:32 crc kubenswrapper[4956]: I0314 08:58:32.910616 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:32Z","lastTransitionTime":"2026-03-14T08:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.012573 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.012636 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.012648 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.012666 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.012678 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:33Z","lastTransitionTime":"2026-03-14T08:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.075940 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gqfh2" event={"ID":"7535708d-1eec-4d4c-b0eb-f5343af71b3d","Type":"ContainerStarted","Data":"2f2f7f4cb8ba82eb7babdaee7c7f92b52d626631bfda4865041d2da040c3a121"} Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.076294 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.094603 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9460a21d093786b0d6c6875b9a0f638f24445d469c310b5df7bd9485de39e38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8
669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f
9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026
-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.100768 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.104496 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2f7f4cb8ba82eb7babdaee7c7f92b52d626631bfda4865041d2da040c3a121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.115138 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.115174 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.115196 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.115215 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.115228 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:33Z","lastTransitionTime":"2026-03-14T08:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.119308 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.132712 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.149976 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.162585 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.172619 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.195139 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc1f6769692baf11365bc50844ae83f545502529a4c6971a50552856faffa94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.208209 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.208729 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.208813 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.208936 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:33 crc kubenswrapper[4956]: E0314 08:58:33.208845 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:33 crc kubenswrapper[4956]: E0314 08:58:33.209057 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:33 crc kubenswrapper[4956]: E0314 08:58:33.209107 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.217574 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.217622 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.217633 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.217651 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.217666 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:33Z","lastTransitionTime":"2026-03-14T08:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.221406 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.236294 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.252205 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.266459 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.278890 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.291467 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.304572 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.315972 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.321828 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 
08:58:33.321860 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.321874 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.321891 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.321903 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:33Z","lastTransitionTime":"2026-03-14T08:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.328698 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.345641 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc1f6769692baf11365bc50844ae83f545502529a4c6971a50552856faffa94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.358884 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.369419 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.382046 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.399216 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.411518 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.424390 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.424438 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.424452 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:33 crc 
kubenswrapper[4956]: I0314 08:58:33.424470 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.424500 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:33Z","lastTransitionTime":"2026-03-14T08:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.424591 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 
08:58:33.439414 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9460a21d093786b0d6c6875b9a0f638f24445d469c310b5df7bd9485de39e38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687
b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"
cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.448781 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2f7f4cb8ba82eb7babdaee7c7f92b52d626631bfda4865041d2da040c3a121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.465907 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.476687 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.488284 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.527691 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.527740 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.527756 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.527777 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.527792 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:33Z","lastTransitionTime":"2026-03-14T08:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.630390 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.630437 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.630452 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.630475 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.630516 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:33Z","lastTransitionTime":"2026-03-14T08:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.733551 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.733616 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.733632 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.733659 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.733677 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:33Z","lastTransitionTime":"2026-03-14T08:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.836318 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.836378 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.836391 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.836408 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.836424 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:33Z","lastTransitionTime":"2026-03-14T08:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.938572 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.938617 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.938627 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.938644 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:33 crc kubenswrapper[4956]: I0314 08:58:33.938658 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:33Z","lastTransitionTime":"2026-03-14T08:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.040819 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.040860 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.040869 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.040885 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.040895 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:34Z","lastTransitionTime":"2026-03-14T08:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.089912 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.089954 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.089966 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.089981 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.089991 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:34Z","lastTransitionTime":"2026-03-14T08:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:34 crc kubenswrapper[4956]: E0314 08:58:34.103719 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:34Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.107396 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.107431 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.107442 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.107460 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.107472 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:34Z","lastTransitionTime":"2026-03-14T08:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:34 crc kubenswrapper[4956]: E0314 08:58:34.120205 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:34Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.124851 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.124891 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.124904 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.124924 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.124936 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:34Z","lastTransitionTime":"2026-03-14T08:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:34 crc kubenswrapper[4956]: E0314 08:58:34.142235 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:34Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.146422 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.146462 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.146492 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.146508 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.146518 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:34Z","lastTransitionTime":"2026-03-14T08:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:34 crc kubenswrapper[4956]: E0314 08:58:34.161012 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:34Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.165907 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.165944 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.165958 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.165974 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.165985 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:34Z","lastTransitionTime":"2026-03-14T08:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:34 crc kubenswrapper[4956]: E0314 08:58:34.177836 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:34Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:34 crc kubenswrapper[4956]: E0314 08:58:34.178162 4956 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.179700 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.179737 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.179746 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.179761 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.179809 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:34Z","lastTransitionTime":"2026-03-14T08:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.282287 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.282312 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.282338 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.282376 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.282386 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:34Z","lastTransitionTime":"2026-03-14T08:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.384985 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.385049 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.385082 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.385106 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.385126 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:34Z","lastTransitionTime":"2026-03-14T08:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.488037 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.488280 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.488293 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.488309 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.488321 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:34Z","lastTransitionTime":"2026-03-14T08:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.590981 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.591046 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.591067 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.591091 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.591110 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:34Z","lastTransitionTime":"2026-03-14T08:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.693110 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.693179 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.693203 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.693229 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.693246 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:34Z","lastTransitionTime":"2026-03-14T08:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.796434 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.796534 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.796561 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.796591 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.796612 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:34Z","lastTransitionTime":"2026-03-14T08:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.900152 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.900191 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.900202 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.900217 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:34 crc kubenswrapper[4956]: I0314 08:58:34.900229 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:34Z","lastTransitionTime":"2026-03-14T08:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.002806 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.002855 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.002867 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.002883 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.002895 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:35Z","lastTransitionTime":"2026-03-14T08:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.084940 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj4pg_57d4b4cb-2115-421e-8f2a-491ec851328c/ovnkube-controller/0.log" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.088073 4956 generic.go:334] "Generic (PLEG): container finished" podID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerID="fdc1f6769692baf11365bc50844ae83f545502529a4c6971a50552856faffa94" exitCode=1 Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.088118 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerDied","Data":"fdc1f6769692baf11365bc50844ae83f545502529a4c6971a50552856faffa94"} Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.088803 4956 scope.go:117] "RemoveContainer" containerID="fdc1f6769692baf11365bc50844ae83f545502529a4c6971a50552856faffa94" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.105130 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.105165 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.105175 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.105191 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.105202 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:35Z","lastTransitionTime":"2026-03-14T08:58:35Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.119844 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc1f6769692baf11365bc50844ae83f545502529a4c6971a50552856faffa94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc1f6769692baf11365bc50844ae83f545502529a4c6971a50552856faffa94\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"from k8s.io/client-go/informers/factory.go:160\\\\nI0314 08:58:34.385216 6839 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0314 08:58:34.385376 6839 
reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:58:34.385814 6839 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0314 08:58:34.385842 6839 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 08:58:34.385848 6839 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0314 08:58:34.385868 6839 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0314 08:58:34.385883 6839 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0314 08:58:34.385905 6839 factory.go:656] Stopping watch factory\\\\nI0314 08:58:34.385921 6839 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:58:34.385933 6839 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0314 08:58:34.385940 6839 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0314 08:58:34.385948 6839 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0314 08:58:34.385957 6839 metrics.go:553] Stopping metrics server at address 
\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08
e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.135614 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 
builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.148653 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.166690 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.182037 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.195282 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: E0314 08:58:35.205364 4956 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.208475 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.208565 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.208500 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:35 crc kubenswrapper[4956]: E0314 08:58:35.208609 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:35 crc kubenswrapper[4956]: E0314 08:58:35.208722 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:35 crc kubenswrapper[4956]: E0314 08:58:35.208834 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.209170 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-releas
e\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.226186 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.241562 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.249982 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2f7f4cb8ba82eb7babdaee7c7f92b52d626631bfda4865041d2da040c3a121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.265638 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.285341 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9460a21d093786b0d6c6875b9a0f638f24445d469c310b5df7bd9485de39e38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735
597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.301866 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf6
5e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.315456 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.326692 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: E0314 08:58:35.329220 4956 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.339974 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.352385 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.361395 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.382319 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc1f6769692baf11365bc50844ae83f545502529a4c6971a50552856faffa94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc1f6769692baf11365bc50844ae83f545502529a4c6971a50552856faffa94\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"from k8s.io/client-go/informers/factory.go:160\\\\nI0314 08:58:34.385216 6839 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0314 08:58:34.385376 6839 
reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:58:34.385814 6839 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0314 08:58:34.385842 6839 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 08:58:34.385848 6839 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0314 08:58:34.385868 6839 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0314 08:58:34.385883 6839 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0314 08:58:34.385905 6839 factory.go:656] Stopping watch factory\\\\nI0314 08:58:34.385921 6839 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:58:34.385933 6839 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0314 08:58:34.385940 6839 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0314 08:58:34.385948 6839 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0314 08:58:34.385957 6839 metrics.go:553] Stopping metrics server at address 
\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08
e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.394141 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.405349 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.416061 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.426595 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.440429 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.455133 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.470291 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9460a21d093786b0d6c6875b9a0f638f24445d469c310b5df7bd9485de39e38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735
597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.481830 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2f7f4cb8ba82eb7babdaee7c7f92b52d626631bfda4865041d2da040c3a121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-14T08:58:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.501243 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.514846 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:35 crc kubenswrapper[4956]: I0314 08:58:35.527922 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.092934 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj4pg_57d4b4cb-2115-421e-8f2a-491ec851328c/ovnkube-controller/1.log" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.093384 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj4pg_57d4b4cb-2115-421e-8f2a-491ec851328c/ovnkube-controller/0.log" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.099281 4956 generic.go:334] "Generic (PLEG): container finished" podID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerID="bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326" exitCode=1 Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.099349 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerDied","Data":"bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326"} Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.099404 4956 scope.go:117] "RemoveContainer" containerID="fdc1f6769692baf11365bc50844ae83f545502529a4c6971a50552856faffa94" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.100033 4956 scope.go:117] "RemoveContainer" containerID="bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326" Mar 14 08:58:36 crc kubenswrapper[4956]: E0314 08:58:36.100354 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qj4pg_openshift-ovn-kubernetes(57d4b4cb-2115-421e-8f2a-491ec851328c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 
08:58:36.113473 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.127284 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.138896 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.158560 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc1f6769692baf11365bc50844ae83f545502529a4c6971a50552856faffa94\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:58:34Z\\\",\\\"message\\\":\\\"from k8s.io/client-go/informers/factory.go:160\\\\nI0314 08:58:34.385216 6839 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0314 08:58:34.385376 6839 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:58:34.385814 6839 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0314 08:58:34.385842 6839 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 08:58:34.385848 6839 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0314 08:58:34.385868 6839 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0314 08:58:34.385883 6839 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0314 08:58:34.385905 6839 factory.go:656] Stopping watch factory\\\\nI0314 08:58:34.385921 6839 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:58:34.385933 6839 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0314 08:58:34.385940 6839 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0314 08:58:34.385948 6839 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0314 08:58:34.385957 6839 metrics.go:553] Stopping metrics server at address \\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:58:35Z\\\",\\\"message\\\":\\\"etwork_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 08:58:35.961518 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.962025 7031 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0314 08:58:35.962029 7031 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 
0 failed attempt(s)\\\\nI0314 08:58:35.962032 7031 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.961523 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0314 08:58:35.962040 7031 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0314 08:58:35.962058 7031 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0314 08:58:35.961526 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\"
:\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.170284 4956 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.186925 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.199828 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.211752 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.225828 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.239563 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.255343 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9460a21d093786b0d6c6875b9a0f638f24445d469c310b5df7bd9485de39e38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735
597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.265989 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2f7f4cb8ba82eb7babdaee7c7f92b52d626631bfda4865041d2da040c3a121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-14T08:58:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.288413 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.300973 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:36 crc kubenswrapper[4956]: I0314 08:58:36.311755 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.104509 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj4pg_57d4b4cb-2115-421e-8f2a-491ec851328c/ovnkube-controller/1.log" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.108768 4956 scope.go:117] "RemoveContainer" containerID="bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326" Mar 14 08:58:37 crc kubenswrapper[4956]: E0314 08:58:37.108998 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qj4pg_openshift-ovn-kubernetes(57d4b4cb-2115-421e-8f2a-491ec851328c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.123175 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.135349 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.145605 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.158293 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.170264 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.186068 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9460a21d093786b0d6c6875b9a0f638f24445d469c310b5df7bd9485de39e38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735
597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.199036 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2f7f4cb8ba82eb7babdaee7c7f92b52d626631bfda4865041d2da040c3a121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-14T08:58:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.209102 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.209144 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.209143 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:37 crc kubenswrapper[4956]: E0314 08:58:37.209239 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:37 crc kubenswrapper[4956]: E0314 08:58:37.209376 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:37 crc kubenswrapper[4956]: E0314 08:58:37.209550 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.213509 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.226896 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.260475 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.274794 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.291004 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.308806 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:58:35Z\\\",\\\"message\\\":\\\"etwork_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 08:58:35.961518 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.962025 7031 ovn.go:134] Ensuring zone local for Pod 
openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0314 08:58:35.962029 7031 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0314 08:58:35.962032 7031 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.961523 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0314 08:58:35.962040 7031 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0314 08:58:35.962058 7031 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0314 08:58:35.961526 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj4pg_openshift-ovn-kubernetes(57d4b4cb-2115-421e-8f2a-491ec851328c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a4
06fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.329370 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, 
/tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.348733 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.484666 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68"] Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.485517 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.489503 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.490743 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.506716 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.541957 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0f
a3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource
-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\
\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.570619 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.589369 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.605147 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fba4fa81-8b34-4c43-aef8-84072b0bc8fb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mdp68\" (UID: \"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.605225 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fba4fa81-8b34-4c43-aef8-84072b0bc8fb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mdp68\" (UID: \"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.605291 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftc8g\" (UniqueName: \"kubernetes.io/projected/fba4fa81-8b34-4c43-aef8-84072b0bc8fb-kube-api-access-ftc8g\") pod \"ovnkube-control-plane-749d76644c-mdp68\" (UID: \"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.605329 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fba4fa81-8b34-4c43-aef8-84072b0bc8fb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mdp68\" (UID: \"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.621372 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:58:35Z\\\",\\\"message\\\":\\\"etwork_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 08:58:35.961518 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.962025 7031 ovn.go:134] Ensuring zone local for Pod 
openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0314 08:58:35.962029 7031 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0314 08:58:35.962032 7031 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.961523 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0314 08:58:35.962040 7031 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0314 08:58:35.962058 7031 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0314 08:58:35.961526 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj4pg_openshift-ovn-kubernetes(57d4b4cb-2115-421e-8f2a-491ec851328c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a4
06fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.646457 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, 
/tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.672386 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.696153 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.706463 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fba4fa81-8b34-4c43-aef8-84072b0bc8fb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mdp68\" (UID: \"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.706557 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fba4fa81-8b34-4c43-aef8-84072b0bc8fb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mdp68\" (UID: \"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.706622 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fba4fa81-8b34-4c43-aef8-84072b0bc8fb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mdp68\" (UID: \"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.706659 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftc8g\" (UniqueName: \"kubernetes.io/projected/fba4fa81-8b34-4c43-aef8-84072b0bc8fb-kube-api-access-ftc8g\") pod \"ovnkube-control-plane-749d76644c-mdp68\" (UID: \"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.707559 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fba4fa81-8b34-4c43-aef8-84072b0bc8fb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mdp68\" (UID: \"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.708049 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fba4fa81-8b34-4c43-aef8-84072b0bc8fb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mdp68\" (UID: \"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.717104 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fba4fa81-8b34-4c43-aef8-84072b0bc8fb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mdp68\" (UID: \"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.719928 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.737637 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftc8g\" (UniqueName: \"kubernetes.io/projected/fba4fa81-8b34-4c43-aef8-84072b0bc8fb-kube-api-access-ftc8g\") pod \"ovnkube-control-plane-749d76644c-mdp68\" (UID: \"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.740331 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\
\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224ca6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.764125 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.785886 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.800873 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.816555 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9460a21d093786b0d6c6875b9a0f638f24445d469c310b5df7bd9485de39e38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735
597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: W0314 08:58:37.830113 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfba4fa81_8b34_4c43_aef8_84072b0bc8fb.slice/crio-aa372f3f576c8a2c041110c3e8a2c4d4f9303b9b7b3c901c861c50bc261b4526 WatchSource:0}: Error finding container aa372f3f576c8a2c041110c3e8a2c4d4f9303b9b7b3c901c861c50bc261b4526: Status 404 returned error can't find the container with id aa372f3f576c8a2c041110c3e8a2c4d4f9303b9b7b3c901c861c50bc261b4526 Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.838124 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2f7f4cb8ba82eb7babdaee7c7f92b52d626631bfda4865041d2da040c3a121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.860429 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdp68\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:37 crc kubenswrapper[4956]: I0314 08:58:37.887992 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.115277 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" event={"ID":"fba4fa81-8b34-4c43-aef8-84072b0bc8fb","Type":"ContainerStarted","Data":"aa372f3f576c8a2c041110c3e8a2c4d4f9303b9b7b3c901c861c50bc261b4526"} Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.431918 4956 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-42pn5"] Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.432893 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:38 crc kubenswrapper[4956]: E0314 08:58:38.433022 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.476103 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7
866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60
ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.500266 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.518624 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs\") pod \"network-metrics-daemon-42pn5\" (UID: \"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\") " pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.518665 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5d9l\" (UniqueName: \"kubernetes.io/projected/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-kube-api-access-x5d9l\") pod \"network-metrics-daemon-42pn5\" (UID: \"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\") " pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.520939 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.541027 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.557712 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.569600 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.589993 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:58:35Z\\\",\\\"message\\\":\\\"etwork_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 08:58:35.961518 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.962025 7031 ovn.go:134] Ensuring zone local for Pod 
openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0314 08:58:35.962029 7031 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0314 08:58:35.962032 7031 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.961523 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0314 08:58:35.962040 7031 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0314 08:58:35.962058 7031 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0314 08:58:35.961526 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj4pg_openshift-ovn-kubernetes(57d4b4cb-2115-421e-8f2a-491ec851328c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a4
06fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.604056 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.619947 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5d9l\" (UniqueName: 
\"kubernetes.io/projected/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-kube-api-access-x5d9l\") pod \"network-metrics-daemon-42pn5\" (UID: \"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\") " pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.620056 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs\") pod \"network-metrics-daemon-42pn5\" (UID: \"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\") " pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:38 crc kubenswrapper[4956]: E0314 08:58:38.620291 4956 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:58:38 crc kubenswrapper[4956]: E0314 08:58:38.620426 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs podName:bf6ad235-d99c-46a7-8c2d-6fc12fc07c10 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:39.120396802 +0000 UTC m=+124.633089090 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs") pod "network-metrics-daemon-42pn5" (UID: "bf6ad235-d99c-46a7-8c2d-6fc12fc07c10") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.621007 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f32
3dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.639057 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.650307 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5d9l\" (UniqueName: \"kubernetes.io/projected/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-kube-api-access-x5d9l\") pod \"network-metrics-daemon-42pn5\" (UID: \"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\") " pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.657895 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.680065 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.692530 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-42pn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5d9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5d9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-42pn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:38 crc 
kubenswrapper[4956]: I0314 08:58:38.708138 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.723443 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9460a21d093786b0d6c6875b9a0f638f24445d469c310b5df7bd9485de39e38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735
597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.736777 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2f7f4cb8ba82eb7babdaee7c7f92b52d626631bfda4865041d2da040c3a121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-14T08:58:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:38 crc kubenswrapper[4956]: I0314 08:58:38.749817 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdp68\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.121100 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" event={"ID":"fba4fa81-8b34-4c43-aef8-84072b0bc8fb","Type":"ContainerStarted","Data":"c9cb47bd1915edd2adeed0e00ccc8086f52fd9a8d5ca7a320ee50e634a31fa43"} Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.121151 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" event={"ID":"fba4fa81-8b34-4c43-aef8-84072b0bc8fb","Type":"ContainerStarted","Data":"cd04728ee48b04ef5bd11e8558af512b2280a0ef7b55fbf5761a62824cd00124"} Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.125094 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs\") pod \"network-metrics-daemon-42pn5\" (UID: \"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\") " pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:39 crc kubenswrapper[4956]: E0314 08:58:39.125255 4956 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:58:39 crc kubenswrapper[4956]: E0314 08:58:39.125327 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs podName:bf6ad235-d99c-46a7-8c2d-6fc12fc07c10 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:40.125305132 +0000 UTC m=+125.637997440 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs") pod "network-metrics-daemon-42pn5" (UID: "bf6ad235-d99c-46a7-8c2d-6fc12fc07c10") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.140561 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f32
3dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.165016 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.182255 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.198884 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.208641 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.208646 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:39 crc kubenswrapper[4956]: E0314 08:58:39.208780 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.208818 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:39 crc kubenswrapper[4956]: E0314 08:58:39.208903 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:39 crc kubenswrapper[4956]: E0314 08:58:39.209191 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.216934 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"na
me\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-14T08:58:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.235193 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.257819 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9460a21d093786b0d6c6875b9a0f638f24445d469c310b5df7bd9485de39e38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735
597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.274732 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2f7f4cb8ba82eb7babdaee7c7f92b52d626631bfda4865041d2da040c3a121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-14T08:58:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.288796 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd04728ee48b04ef5bd11e8558af512b2280a0ef7b55fbf5761a62824cd00124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9cb47bd1915edd2adeed0e00ccc8086f52fd
9a8d5ca7a320ee50e634a31fa43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdp68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.300562 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-42pn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5d9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5d9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-42pn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:39 crc 
kubenswrapper[4956]: I0314 08:58:39.327326 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.345566 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.362195 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.378806 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.394260 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.415762 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:39 crc kubenswrapper[4956]: I0314 08:58:39.441272 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:58:35Z\\\",\\\"message\\\":\\\"etwork_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 08:58:35.961518 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.962025 7031 ovn.go:134] Ensuring zone local for Pod 
openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0314 08:58:35.962029 7031 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0314 08:58:35.962032 7031 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.961523 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0314 08:58:35.962040 7031 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0314 08:58:35.962058 7031 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0314 08:58:35.961526 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj4pg_openshift-ovn-kubernetes(57d4b4cb-2115-421e-8f2a-491ec851328c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a4
06fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:40 crc kubenswrapper[4956]: I0314 08:58:40.137230 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs\") pod \"network-metrics-daemon-42pn5\" (UID: \"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\") " pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:40 crc kubenswrapper[4956]: E0314 08:58:40.137448 4956 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:58:40 crc kubenswrapper[4956]: E0314 08:58:40.137799 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs podName:bf6ad235-d99c-46a7-8c2d-6fc12fc07c10 nodeName:}" failed. 
No retries permitted until 2026-03-14 08:58:42.137766775 +0000 UTC m=+127.650459083 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs") pod "network-metrics-daemon-42pn5" (UID: "bf6ad235-d99c-46a7-8c2d-6fc12fc07c10") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:58:40 crc kubenswrapper[4956]: I0314 08:58:40.208862 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:40 crc kubenswrapper[4956]: E0314 08:58:40.209125 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:58:40 crc kubenswrapper[4956]: E0314 08:58:40.330621 4956 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.209304 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.209304 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:41 crc kubenswrapper[4956]: E0314 08:58:41.209636 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.209389 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:41 crc kubenswrapper[4956]: E0314 08:58:41.209708 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:41 crc kubenswrapper[4956]: E0314 08:58:41.209965 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.669429 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.695435 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:41Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.715622 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:41Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.738237 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:41Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.759017 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:41Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.781629 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:41Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.796423 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:41Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.828268 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:58:35Z\\\",\\\"message\\\":\\\"etwork_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 08:58:35.961518 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.962025 7031 ovn.go:134] Ensuring zone local for Pod 
openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0314 08:58:35.962029 7031 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0314 08:58:35.962032 7031 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.961523 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0314 08:58:35.962040 7031 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0314 08:58:35.962058 7031 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0314 08:58:35.961526 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj4pg_openshift-ovn-kubernetes(57d4b4cb-2115-421e-8f2a-491ec851328c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a4
06fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:41Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.847360 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:41Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.872177 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:41Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.898651 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:41Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.916350 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:41Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.937475 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:41Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.958842 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:41Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.978358 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9460a21d093786b0d6c6875b9a0f638f24445d469c310b5df7bd9485de39e38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735
597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:41Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:41 crc kubenswrapper[4956]: I0314 08:58:41.994637 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2f7f4cb8ba82eb7babdaee7c7f92b52d626631bfda4865041d2da040c3a121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-14T08:58:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:41Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:42 crc kubenswrapper[4956]: I0314 08:58:42.011645 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd04728ee48b04ef5bd11e8558af512b2280a0ef7b55fbf5761a62824cd00124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9cb47bd1915edd2adeed0e00ccc8086f52fd
9a8d5ca7a320ee50e634a31fa43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdp68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:42Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:42 crc kubenswrapper[4956]: I0314 08:58:42.032277 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-42pn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5d9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5d9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-42pn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:42Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:42 crc 
kubenswrapper[4956]: I0314 08:58:42.161176 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs\") pod \"network-metrics-daemon-42pn5\" (UID: \"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\") " pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:42 crc kubenswrapper[4956]: E0314 08:58:42.161423 4956 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:58:42 crc kubenswrapper[4956]: E0314 08:58:42.161556 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs podName:bf6ad235-d99c-46a7-8c2d-6fc12fc07c10 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:46.161522923 +0000 UTC m=+131.674215241 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs") pod "network-metrics-daemon-42pn5" (UID: "bf6ad235-d99c-46a7-8c2d-6fc12fc07c10") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:58:42 crc kubenswrapper[4956]: I0314 08:58:42.209162 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:42 crc kubenswrapper[4956]: E0314 08:58:42.209375 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:58:43 crc kubenswrapper[4956]: I0314 08:58:43.209269 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:43 crc kubenswrapper[4956]: I0314 08:58:43.209300 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:43 crc kubenswrapper[4956]: E0314 08:58:43.209443 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:43 crc kubenswrapper[4956]: I0314 08:58:43.209515 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:43 crc kubenswrapper[4956]: E0314 08:58:43.209570 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:43 crc kubenswrapper[4956]: E0314 08:58:43.209703 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.208357 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:44 crc kubenswrapper[4956]: E0314 08:58:44.208647 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.476040 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.476116 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.476130 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.476150 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.476164 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:44Z","lastTransitionTime":"2026-03-14T08:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:44 crc kubenswrapper[4956]: E0314 08:58:44.497031 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:44Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.502466 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.502537 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.502551 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.502576 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.502596 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:44Z","lastTransitionTime":"2026-03-14T08:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.526547 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.526603 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.526617 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.526637 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.526650 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:44Z","lastTransitionTime":"2026-03-14T08:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.546993 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.547045 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.547056 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.547075 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.547091 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:44Z","lastTransitionTime":"2026-03-14T08:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:44 crc kubenswrapper[4956]: E0314 08:58:44.567190 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:44Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.571387 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.571457 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.571468 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.571517 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:44 crc kubenswrapper[4956]: I0314 08:58:44.571538 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:44Z","lastTransitionTime":"2026-03-14T08:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:44 crc kubenswrapper[4956]: E0314 08:58:44.586900 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:44Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:44 crc kubenswrapper[4956]: E0314 08:58:44.587146 4956 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.208661 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:45 crc kubenswrapper[4956]: E0314 08:58:45.208881 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.208930 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.209052 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:45 crc kubenswrapper[4956]: E0314 08:58:45.209227 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:45 crc kubenswrapper[4956]: E0314 08:58:45.209425 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.234580 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.262900 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.281932 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.301667 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.319190 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:45 crc kubenswrapper[4956]: E0314 08:58:45.331476 4956 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.335669 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-42pn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5d9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5d9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-42pn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:45 crc 
kubenswrapper[4956]: I0314 08:58:45.355078 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.380828 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9460a21d093786b0d6c6875b9a0f638f24445d469c310b5df7bd9485de39e38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735
597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.395564 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2f7f4cb8ba82eb7babdaee7c7f92b52d626631bfda4865041d2da040c3a121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-14T08:58:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.413747 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd04728ee48b04ef5bd11e8558af512b2280a0ef7b55fbf5761a62824cd00124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9cb47bd1915edd2adeed0e00ccc8086f52fd
9a8d5ca7a320ee50e634a31fa43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdp68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.445694 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.470374 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.483784 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.502725 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.516370 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.528119 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:45 crc kubenswrapper[4956]: I0314 08:58:45.550127 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:58:35Z\\\",\\\"message\\\":\\\"etwork_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 08:58:35.961518 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.962025 7031 ovn.go:134] Ensuring zone local for Pod 
openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0314 08:58:35.962029 7031 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0314 08:58:35.962032 7031 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.961523 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0314 08:58:35.962040 7031 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0314 08:58:35.962058 7031 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0314 08:58:35.961526 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj4pg_openshift-ovn-kubernetes(57d4b4cb-2115-421e-8f2a-491ec851328c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a4
06fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:46 crc kubenswrapper[4956]: I0314 08:58:46.209033 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:46 crc kubenswrapper[4956]: E0314 08:58:46.209749 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:58:46 crc kubenswrapper[4956]: I0314 08:58:46.210256 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs\") pod \"network-metrics-daemon-42pn5\" (UID: \"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\") " pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:46 crc kubenswrapper[4956]: E0314 08:58:46.210703 4956 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:58:46 crc kubenswrapper[4956]: E0314 08:58:46.211035 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs podName:bf6ad235-d99c-46a7-8c2d-6fc12fc07c10 nodeName:}" failed. No retries permitted until 2026-03-14 08:58:54.211005445 +0000 UTC m=+139.723697753 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs") pod "network-metrics-daemon-42pn5" (UID: "bf6ad235-d99c-46a7-8c2d-6fc12fc07c10") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:58:47 crc kubenswrapper[4956]: I0314 08:58:47.208762 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:47 crc kubenswrapper[4956]: I0314 08:58:47.208827 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:47 crc kubenswrapper[4956]: I0314 08:58:47.208836 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:47 crc kubenswrapper[4956]: E0314 08:58:47.208918 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:47 crc kubenswrapper[4956]: E0314 08:58:47.209000 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:47 crc kubenswrapper[4956]: E0314 08:58:47.209196 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:47 crc kubenswrapper[4956]: I0314 08:58:47.224776 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 14 08:58:48 crc kubenswrapper[4956]: I0314 08:58:48.209290 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:48 crc kubenswrapper[4956]: E0314 08:58:48.209559 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:58:49 crc kubenswrapper[4956]: I0314 08:58:49.208694 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:49 crc kubenswrapper[4956]: E0314 08:58:49.208939 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:49 crc kubenswrapper[4956]: I0314 08:58:49.208990 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:49 crc kubenswrapper[4956]: I0314 08:58:49.209052 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:49 crc kubenswrapper[4956]: E0314 08:58:49.209702 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:49 crc kubenswrapper[4956]: E0314 08:58:49.209768 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:49 crc kubenswrapper[4956]: I0314 08:58:49.210328 4956 scope.go:117] "RemoveContainer" containerID="bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.169683 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj4pg_57d4b4cb-2115-421e-8f2a-491ec851328c/ovnkube-controller/1.log" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.175246 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerStarted","Data":"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3"} Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.175970 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.195102 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.208790 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:50 crc kubenswrapper[4956]: E0314 08:58:50.209045 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.212299 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.223278 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.239219 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.252940 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.273979 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9460a21d093786b0d6c6875b9a0f638f24445d469c310b5df7bd9485de39e38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735
597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.284557 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2f7f4cb8ba82eb7babdaee7c7f92b52d626631bfda4865041d2da040c3a121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-14T08:58:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.299787 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd04728ee48b04ef5bd11e8558af512b2280a0ef7b55fbf5761a62824cd00124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9cb47bd1915edd2adeed0e00ccc8086f52fd
9a8d5ca7a320ee50e634a31fa43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdp68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.312370 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-42pn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5d9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5d9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-42pn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:50 crc 
kubenswrapper[4956]: I0314 08:58:50.323954 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de547b7d-e8fc-4600-97cf-6ca332aedec8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b214fbcc640386ff8fd014404394f14a77bd9a8570ac1533357e7112f8ca4fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://d8f2eca23cbbc7619ab093bc76dca55f111fbc069a1a93e88e9ad077d5ac3341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8f2eca23cbbc7619ab093bc76dca55f111fbc069a1a93e88e9ad077d5ac3341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:50 crc kubenswrapper[4956]: E0314 08:58:50.332473 4956 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.343959 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.358914 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.388661 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.402100 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.412987 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.443088 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:58:35Z\\\",\\\"message\\\":\\\"etwork_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 08:58:35.961518 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.962025 7031 ovn.go:134] Ensuring zone local for Pod 
openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0314 08:58:35.962029 7031 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0314 08:58:35.962032 7031 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.961523 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0314 08:58:35.962040 7031 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0314 08:58:35.962058 7031 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0314 08:58:35.961526 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.464776 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\"
,\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:50 crc kubenswrapper[4956]: I0314 08:58:50.482175 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:50Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.187467 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj4pg_57d4b4cb-2115-421e-8f2a-491ec851328c/ovnkube-controller/2.log" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.188631 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj4pg_57d4b4cb-2115-421e-8f2a-491ec851328c/ovnkube-controller/1.log" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.193934 4956 generic.go:334] "Generic (PLEG): container finished" podID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerID="ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3" exitCode=1 Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.194022 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerDied","Data":"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3"} Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.194108 4956 scope.go:117] "RemoveContainer" containerID="bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.195740 4956 scope.go:117] "RemoveContainer" containerID="ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3" Mar 14 08:58:51 crc kubenswrapper[4956]: E0314 08:58:51.196144 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qj4pg_openshift-ovn-kubernetes(57d4b4cb-2115-421e-8f2a-491ec851328c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.209537 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.209670 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.209826 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:51 crc kubenswrapper[4956]: E0314 08:58:51.209833 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:51 crc kubenswrapper[4956]: E0314 08:58:51.209991 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:51 crc kubenswrapper[4956]: E0314 08:58:51.210631 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.215821 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de547b7d-e8fc-4600-97cf-6ca332aedec8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b214fbcc640386ff8fd014404394f14a77bd9a8570ac1533357e7112f8ca4fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\
\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8f2eca23cbbc7619ab093bc76dca55f111fbc069a1a93e88e9ad077d5ac3341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8f2eca23cbbc7619ab093bc76dca55f111fbc069a1a93e88e9ad077d5ac3341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.240596 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.258770 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9460a21d093786b0d6c6875b9a0f638f24445d469c310b5df7bd9485de39e38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735
597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.270413 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2f7f4cb8ba82eb7babdaee7c7f92b52d626631bfda4865041d2da040c3a121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-14T08:58:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.287545 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd04728ee48b04ef5bd11e8558af512b2280a0ef7b55fbf5761a62824cd00124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9cb47bd1915edd2adeed0e00ccc8086f52fd
9a8d5ca7a320ee50e634a31fa43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdp68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.302571 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-42pn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5d9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5d9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-42pn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc 
kubenswrapper[4956]: I0314 08:58:51.327549 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.343809 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.369896 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.395835 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.417307 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.428069 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.447597 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb2967e845f21c9f5878c4c02496b072b17b3747f3910d0e34a247cc1ad73326\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:58:35Z\\\",\\\"message\\\":\\\"etwork_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 08:58:35.961518 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.962025 7031 ovn.go:134] Ensuring zone local for Pod 
openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0314 08:58:35.962029 7031 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0314 08:58:35.962032 7031 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0314 08:58:35.961523 7031 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0314 08:58:35.962040 7031 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0314 08:58:35.962058 7031 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0314 08:58:35.961526 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:58:50Z\\\",\\\"message\\\":\\\"ne-config-operator/kube-rbac-proxy-crio-crc openshift-multus/multus-sgnxb openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-etcd/etcd-crc openshift-machine-config-operator/machine-config-daemon-mxjrk openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68]\\\\nI0314 08:58:50.179372 7262 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0314 08:58:50.179392 7262 
obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68\\\\nI0314 08:58:50.179388 7262 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI0314 08:58:50.179419 7262 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 4.345808ms\\\\nI0314 08:58:50.179424 7262 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0314 08:58:50.179449 7262 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nF0314 08:58:50.179529 7262 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\
"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.462064 4956 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.479164 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.494814 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.510979 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:51 crc kubenswrapper[4956]: I0314 08:58:51.528264 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.200036 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj4pg_57d4b4cb-2115-421e-8f2a-491ec851328c/ovnkube-controller/2.log" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.205853 4956 scope.go:117] "RemoveContainer" containerID="ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3" Mar 14 08:58:52 crc kubenswrapper[4956]: E0314 08:58:52.206103 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qj4pg_openshift-ovn-kubernetes(57d4b4cb-2115-421e-8f2a-491ec851328c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.208990 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:52 crc kubenswrapper[4956]: E0314 08:58:52.209137 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.222803 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.243702 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.265157 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.277264 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.293502 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.303424 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de547b7d-e8fc-4600-97cf-6ca332aedec8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b214fbcc640386ff8fd014404394f14a77bd9a8570ac1533357e7112f8ca4fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8f2eca23cbbc7619ab093bc76dca55f111fbc069a1a93e88e9ad077d5ac3341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8f2eca23cbbc7619ab093bc76dca55f111fbc069a1a93e88e9ad077d5ac3341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.314299 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.325817 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9460a21d093786b0d6c6875b9a0f638f24445d469c310b5df7bd9485de39e38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735
597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.334467 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2f7f4cb8ba82eb7babdaee7c7f92b52d626631bfda4865041d2da040c3a121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-14T08:58:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.345442 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd04728ee48b04ef5bd11e8558af512b2280a0ef7b55fbf5761a62824cd00124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9cb47bd1915edd2adeed0e00ccc8086f52fd
9a8d5ca7a320ee50e634a31fa43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdp68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.355654 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-42pn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5d9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5d9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-42pn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc 
kubenswrapper[4956]: I0314 08:58:52.373433 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.386506 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.397788 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.410754 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.425265 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.437123 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:52 crc kubenswrapper[4956]: I0314 08:58:52.454729 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:58:50Z\\\",\\\"message\\\":\\\"ne-config-operator/kube-rbac-proxy-crio-crc openshift-multus/multus-sgnxb openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-etcd/etcd-crc openshift-machine-config-operator/machine-config-daemon-mxjrk 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68]\\\\nI0314 08:58:50.179372 7262 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0314 08:58:50.179392 7262 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68\\\\nI0314 08:58:50.179388 7262 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI0314 08:58:50.179419 7262 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 4.345808ms\\\\nI0314 08:58:50.179424 7262 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0314 08:58:50.179449 7262 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nF0314 08:58:50.179529 7262 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj4pg_openshift-ovn-kubernetes(57d4b4cb-2115-421e-8f2a-491ec851328c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a4
06fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:52Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:53 crc kubenswrapper[4956]: I0314 08:58:53.208655 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:53 crc kubenswrapper[4956]: I0314 08:58:53.208655 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:53 crc kubenswrapper[4956]: I0314 08:58:53.209106 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:53 crc kubenswrapper[4956]: E0314 08:58:53.209337 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:53 crc kubenswrapper[4956]: E0314 08:58:53.209588 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:53 crc kubenswrapper[4956]: E0314 08:58:53.209656 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.209436 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:54 crc kubenswrapper[4956]: E0314 08:58:54.209787 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.298439 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs\") pod \"network-metrics-daemon-42pn5\" (UID: \"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\") " pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:54 crc kubenswrapper[4956]: E0314 08:58:54.298779 4956 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:58:54 crc kubenswrapper[4956]: E0314 08:58:54.298942 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs podName:bf6ad235-d99c-46a7-8c2d-6fc12fc07c10 nodeName:}" failed. No retries permitted until 2026-03-14 08:59:10.298912146 +0000 UTC m=+155.811604454 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs") pod "network-metrics-daemon-42pn5" (UID: "bf6ad235-d99c-46a7-8c2d-6fc12fc07c10") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.910144 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.910204 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.910222 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.910248 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.910271 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:54Z","lastTransitionTime":"2026-03-14T08:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:54 crc kubenswrapper[4956]: E0314 08:58:54.928310 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.933359 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.933434 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.933458 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.933579 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.933614 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:54Z","lastTransitionTime":"2026-03-14T08:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:54 crc kubenswrapper[4956]: E0314 08:58:54.953565 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.958272 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.958354 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.958380 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.958432 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.958458 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:54Z","lastTransitionTime":"2026-03-14T08:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:54 crc kubenswrapper[4956]: E0314 08:58:54.977885 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.984427 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.984533 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.984560 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.984590 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:54 crc kubenswrapper[4956]: I0314 08:58:54.984615 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:54Z","lastTransitionTime":"2026-03-14T08:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:55 crc kubenswrapper[4956]: E0314 08:58:55.001002 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.006667 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.006902 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.007037 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.007181 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.007307 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:58:55Z","lastTransitionTime":"2026-03-14T08:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:58:55 crc kubenswrapper[4956]: E0314 08:58:55.029263 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:58:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"da1aafa5-6606-474c-a451-a259d5bddf37\\\",\\\"systemUUID\\\":\\\"6518bcde-aa50-4603-92c7-71dcf31294f9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: E0314 08:58:55.029376 4956 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.208611 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.208736 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:55 crc kubenswrapper[4956]: E0314 08:58:55.208788 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.208639 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:55 crc kubenswrapper[4956]: E0314 08:58:55.208988 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:55 crc kubenswrapper[4956]: E0314 08:58:55.209042 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.220848 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-42pn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5d9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5d9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-42pn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc 
kubenswrapper[4956]: I0314 08:58:55.231420 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de547b7d-e8fc-4600-97cf-6ca332aedec8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b214fbcc640386ff8fd014404394f14a77bd9a8570ac1533357e7112f8ca4fa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://d8f2eca23cbbc7619ab093bc76dca55f111fbc069a1a93e88e9ad077d5ac3341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8f2eca23cbbc7619ab093bc76dca55f111fbc069a1a93e88e9ad077d5ac3341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.249893 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4ad91ef31e1c231b41a809d4ceed37a5b26f4e096f78886b03462acada2ba2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb97188d743ebe0f9eb0b5dd4638c37d06ff928d9b5248a9e602975c94810ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.273715 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2131e0d-5d1b-4913-8908-e15859b063a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9460a21d093786b0d6c6875b9a0f638f24445d469c310b5df7bd9485de39e38f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c826890c9333ecc9140e76665d24a6dd1f02d008ceb43765c2eb7a0d9cc3112b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825a071315bf8669878a08a4d2404c39f2b469dd25a30f640a25f687b25a07d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66630f98a682f9564cd4878975bd9339dbaf830d1fee7bd0717748088a5af4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5735
597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5735597b45a72289c7eab8f20a1496e553a8ced1306173d230d0006c4039818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d52afc8376bcacae24e22677e29926cb2fde172e51f50f435a56116c5239e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4251864b968e84460ff4769824f5105418be0af635ce5bc03a21d1e300d155f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5xlw2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.285596 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gqfh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7535708d-1eec-4d4c-b0eb-f5343af71b3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2f7f4cb8ba82eb7babdaee7c7f92b52d626631bfda4865041d2da040c3a121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-14T08:58:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rs6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gqfh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.304942 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba4fa81-8b34-4c43-aef8-84072b0bc8fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd04728ee48b04ef5bd11e8558af512b2280a0ef7b55fbf5761a62824cd00124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9cb47bd1915edd2adeed0e00ccc8086f52fd
9a8d5ca7a320ee50e634a31fa43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftc8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdp68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: E0314 08:58:55.333074 4956 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.342128 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a5e586-f77e-44bc-b604-276c88fe8626\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aae18cc9cbd1fc98d6f3baabfd134aa494d1f8b542e02ffb34bd4d9fd541935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47ad5cf31189314bc61886c11745185d3096716f13db76fd76d88b07a20ea92f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a577f905ca33070d122aadc703100bc2a88ba8eedcf05fe8639581e7870599e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5577ddba169fdf65e4230396f9336d3d3e902611776bd48d840223585755aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ca26922a8b738513a4595874a2d8af4e7ad735c710506f3bf53225be5b44b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08d50be29a9895c68d9709fa6e5a716868fad2480f7ee805e43759d5c284d300\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c183ef17fa37f17bffc0d7137947a190678ffb7074bcbc2cd4a2981e203d534\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c15a3da06e564c8760bed691790ffd8984150e60ea40863d64b3a630f317c7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.362618 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.383211 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac846c674a5660699db42042e3ee36f53edd309cbdc123f32e00249057fdc5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.399209 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"398abfcc-e8de-4f30-ae7a-f20c3120f379\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:57:39Z\\\",\\\"message\\\":\\\"W0314 08:57:39.033935 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0314 08:57:39.034352 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773478659 cert, and key in /tmp/serving-cert-3443639880/serving-signer.crt, /tmp/serving-cert-3443639880/serving-signer.key\\\\nI0314 08:57:39.330926 1 observer_polling.go:159] Starting file observer\\\\nW0314 08:57:39.354645 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0314 08:57:39.354993 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:57:39.356282 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3443639880/tls.crt::/tmp/serving-cert-3443639880/tls.key\\\\\\\"\\\\nF0314 08:57:39.815515 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:57:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.418637 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.433700 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h264v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7e96975-9562-47ec-9476-e593f8d6be98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465d47ec4f0cc59c8875084deefaf1da41f0a83ac5330b508ab2ed5998fbd331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mblhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h264v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.471155 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57d4b4cb-2115-421e-8f2a-491ec851328c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:58:50Z\\\",\\\"message\\\":\\\"ne-config-operator/kube-rbac-proxy-crio-crc openshift-multus/multus-sgnxb openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-etcd/etcd-crc openshift-machine-config-operator/machine-config-daemon-mxjrk 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68]\\\\nI0314 08:58:50.179372 7262 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0314 08:58:50.179392 7262 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68\\\\nI0314 08:58:50.179388 7262 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI0314 08:58:50.179419 7262 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 4.345808ms\\\\nI0314 08:58:50.179424 7262 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0314 08:58:50.179449 7262 services_controller.go:356] Processing sync for service openshift-machine-config-operator/machine-config-controller for network=default\\\\nF0314 08:58:50.179529 7262 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj4pg_openshift-ovn-kubernetes(57d4b4cb-2115-421e-8f2a-491ec851328c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aaa3b1c1f9b885a4
06fb6ea724eed82d04441028c19bd1b048f14ba08e663b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkxwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj4pg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.485268 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgnxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7528e098-09d4-436f-a32d-a0e82e76b8e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82zwx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgnxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.502136 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4341423-d5ba-43c2-bd6a-5453fd44fc5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61df314532b1530d4cba285e9e3609bd1dcaadf4765e36598f39ef48dd1e2820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0632bb1c3201e63cf291ee337d194644f26e39faa514208245f786e2361d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d8a488da4b82cb59afe8f1bc1f5bc50e22599377c3b2dde9d4504432958850b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0620e3230f323dd161b55df40d7c023859b07fae37061bc9a743365d847993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:56:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:56:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:56:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.521802 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a037582aa085d9c39a1456ed76cb11da8ff08966d8e49437fe940a8643b4c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.538401 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:57:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:55 crc kubenswrapper[4956]: I0314 08:58:55.551540 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ba20367-e506-422e-a846-eb1525cb3b94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:58:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706b4c81de409b0311bc38821512c0147393b5d2ddc2648ad635fc4efe449c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224c
a6f75c23ed7ffbf39bd12cee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cczk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:58:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mxjrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:58:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:58:56 crc kubenswrapper[4956]: I0314 08:58:56.208614 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:56 crc kubenswrapper[4956]: E0314 08:58:56.208735 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:58:57 crc kubenswrapper[4956]: I0314 08:58:57.209337 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:57 crc kubenswrapper[4956]: I0314 08:58:57.209418 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:57 crc kubenswrapper[4956]: E0314 08:58:57.209593 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:58:57 crc kubenswrapper[4956]: E0314 08:58:57.209693 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:57 crc kubenswrapper[4956]: I0314 08:58:57.209727 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:57 crc kubenswrapper[4956]: E0314 08:58:57.209801 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:57 crc kubenswrapper[4956]: I0314 08:58:57.224626 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 14 08:58:58 crc kubenswrapper[4956]: I0314 08:58:58.208818 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:58:58 crc kubenswrapper[4956]: E0314 08:58:58.208972 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:58:59 crc kubenswrapper[4956]: I0314 08:58:59.209561 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:58:59 crc kubenswrapper[4956]: I0314 08:58:59.209624 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:58:59 crc kubenswrapper[4956]: E0314 08:58:59.210157 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:58:59 crc kubenswrapper[4956]: I0314 08:58:59.210541 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:58:59 crc kubenswrapper[4956]: E0314 08:58:59.210760 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:58:59 crc kubenswrapper[4956]: E0314 08:58:59.211522 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:59:00 crc kubenswrapper[4956]: I0314 08:59:00.208999 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:00 crc kubenswrapper[4956]: E0314 08:59:00.209154 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:59:00 crc kubenswrapper[4956]: E0314 08:59:00.334838 4956 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:59:01 crc kubenswrapper[4956]: I0314 08:59:01.209046 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:59:01 crc kubenswrapper[4956]: I0314 08:59:01.209121 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:59:01 crc kubenswrapper[4956]: I0314 08:59:01.209283 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:59:01 crc kubenswrapper[4956]: E0314 08:59:01.209408 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:59:01 crc kubenswrapper[4956]: E0314 08:59:01.209635 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:59:01 crc kubenswrapper[4956]: E0314 08:59:01.209727 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:59:02 crc kubenswrapper[4956]: I0314 08:59:02.208993 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:02 crc kubenswrapper[4956]: E0314 08:59:02.209226 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:59:03 crc kubenswrapper[4956]: I0314 08:59:03.090757 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:03 crc kubenswrapper[4956]: E0314 08:59:03.090931 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 09:00:07.090898832 +0000 UTC m=+212.603591140 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:03 crc kubenswrapper[4956]: I0314 08:59:03.090999 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:59:03 crc kubenswrapper[4956]: I0314 08:59:03.091051 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:59:03 crc kubenswrapper[4956]: E0314 08:59:03.091202 4956 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:59:03 crc kubenswrapper[4956]: E0314 08:59:03.091266 4956 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:59:03 crc kubenswrapper[4956]: E0314 08:59:03.092371 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 09:00:07.091276022 +0000 UTC m=+212.603968330 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:59:03 crc kubenswrapper[4956]: E0314 08:59:03.092471 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 09:00:07.092437291 +0000 UTC m=+212.605129609 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:59:03 crc kubenswrapper[4956]: I0314 08:59:03.192363 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:59:03 crc kubenswrapper[4956]: I0314 08:59:03.192456 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:59:03 crc kubenswrapper[4956]: E0314 08:59:03.192695 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:59:03 crc kubenswrapper[4956]: E0314 08:59:03.192721 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:59:03 crc kubenswrapper[4956]: E0314 08:59:03.192740 4956 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:59:03 crc kubenswrapper[4956]: E0314 08:59:03.192838 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 09:00:07.192816421 +0000 UTC m=+212.705508729 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:59:03 crc kubenswrapper[4956]: E0314 08:59:03.192901 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:59:03 crc kubenswrapper[4956]: E0314 08:59:03.192985 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:59:03 crc kubenswrapper[4956]: E0314 08:59:03.193003 4956 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:59:03 crc kubenswrapper[4956]: E0314 08:59:03.193099 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" 
failed. No retries permitted until 2026-03-14 09:00:07.193073077 +0000 UTC m=+212.705765545 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:59:03 crc kubenswrapper[4956]: I0314 08:59:03.208422 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:59:03 crc kubenswrapper[4956]: I0314 08:59:03.208428 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:59:03 crc kubenswrapper[4956]: E0314 08:59:03.208683 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:59:03 crc kubenswrapper[4956]: E0314 08:59:03.208873 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:59:03 crc kubenswrapper[4956]: I0314 08:59:03.208429 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:59:03 crc kubenswrapper[4956]: E0314 08:59:03.209017 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:59:04 crc kubenswrapper[4956]: I0314 08:59:04.208718 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:04 crc kubenswrapper[4956]: E0314 08:59:04.208897 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.208530 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.208550 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:59:05 crc kubenswrapper[4956]: E0314 08:59:05.208743 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.208843 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:59:05 crc kubenswrapper[4956]: E0314 08:59:05.208902 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:59:05 crc kubenswrapper[4956]: E0314 08:59:05.209063 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.210181 4956 scope.go:117] "RemoveContainer" containerID="ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3" Mar 14 08:59:05 crc kubenswrapper[4956]: E0314 08:59:05.210462 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qj4pg_openshift-ovn-kubernetes(57d4b4cb-2115-421e-8f2a-491ec851328c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.248418 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=54.248403777 podStartE2EDuration="54.248403777s" podCreationTimestamp="2026-03-14 08:58:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:05.234987063 +0000 UTC m=+150.747679321" watchObservedRunningTime="2026-03-14 08:59:05.248403777 +0000 UTC m=+150.761096045" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.261691 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-h264v" podStartSLOduration=77.261672627 podStartE2EDuration="1m17.261672627s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:05.261618506 +0000 UTC m=+150.774310814" watchObservedRunningTime="2026-03-14 08:59:05.261672627 +0000 UTC m=+150.774364895" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.302616 4956 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.302691 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.302684 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=8.302669208 podStartE2EDuration="8.302669208s" podCreationTimestamp="2026-03-14 08:58:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:05.30195419 +0000 UTC m=+150.814646478" watchObservedRunningTime="2026-03-14 08:59:05.302669208 +0000 UTC m=+150.815361486" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.303061 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.303222 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.303242 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:59:05Z","lastTransitionTime":"2026-03-14T08:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.317053 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=38.317032016 podStartE2EDuration="38.317032016s" podCreationTimestamp="2026-03-14 08:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:05.316933414 +0000 UTC m=+150.829625702" watchObservedRunningTime="2026-03-14 08:59:05.317032016 +0000 UTC m=+150.829724294" Mar 14 08:59:05 crc kubenswrapper[4956]: E0314 08:59:05.335349 4956 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.350622 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6"] Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.351027 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.353195 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.353308 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.353829 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.354286 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.380443 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podStartSLOduration=77.380428355 podStartE2EDuration="1m17.380428355s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:05.379963423 +0000 UTC m=+150.892655691" watchObservedRunningTime="2026-03-14 08:59:05.380428355 +0000 UTC m=+150.893120623" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.391684 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-sgnxb" podStartSLOduration=77.391674245 podStartE2EDuration="1m17.391674245s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:05.39147209 +0000 UTC m=+150.904164358" watchObservedRunningTime="2026-03-14 08:59:05.391674245 +0000 UTC 
m=+150.904366513" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.413006 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=18.412992796 podStartE2EDuration="18.412992796s" podCreationTimestamp="2026-03-14 08:58:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:05.401186042 +0000 UTC m=+150.913878310" watchObservedRunningTime="2026-03-14 08:59:05.412992796 +0000 UTC m=+150.925685064" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.415041 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25e45193-cbd7-4d19-be68-38e0008a5a3b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mwsr6\" (UID: \"25e45193-cbd7-4d19-be68-38e0008a5a3b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.415179 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/25e45193-cbd7-4d19-be68-38e0008a5a3b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mwsr6\" (UID: \"25e45193-cbd7-4d19-be68-38e0008a5a3b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.415228 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/25e45193-cbd7-4d19-be68-38e0008a5a3b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mwsr6\" (UID: \"25e45193-cbd7-4d19-be68-38e0008a5a3b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 
08:59:05.415278 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/25e45193-cbd7-4d19-be68-38e0008a5a3b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mwsr6\" (UID: \"25e45193-cbd7-4d19-be68-38e0008a5a3b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.415397 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25e45193-cbd7-4d19-be68-38e0008a5a3b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mwsr6\" (UID: \"25e45193-cbd7-4d19-be68-38e0008a5a3b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.431855 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5xlw2" podStartSLOduration=77.431837525 podStartE2EDuration="1m17.431837525s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:05.431321542 +0000 UTC m=+150.944013810" watchObservedRunningTime="2026-03-14 08:59:05.431837525 +0000 UTC m=+150.944529793" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.459085 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gqfh2" podStartSLOduration=77.459057773 podStartE2EDuration="1m17.459057773s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:05.447839494 +0000 UTC m=+150.960531782" watchObservedRunningTime="2026-03-14 08:59:05.459057773 +0000 
UTC m=+150.971750081" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.459939 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdp68" podStartSLOduration=76.459924515 podStartE2EDuration="1m16.459924515s" podCreationTimestamp="2026-03-14 08:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:05.458626152 +0000 UTC m=+150.971318430" watchObservedRunningTime="2026-03-14 08:59:05.459924515 +0000 UTC m=+150.972616823" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.494857 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=45.494839504 podStartE2EDuration="45.494839504s" podCreationTimestamp="2026-03-14 08:58:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:05.493358917 +0000 UTC m=+151.006051215" watchObservedRunningTime="2026-03-14 08:59:05.494839504 +0000 UTC m=+151.007531782" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.516420 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25e45193-cbd7-4d19-be68-38e0008a5a3b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mwsr6\" (UID: \"25e45193-cbd7-4d19-be68-38e0008a5a3b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.516823 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25e45193-cbd7-4d19-be68-38e0008a5a3b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mwsr6\" (UID: \"25e45193-cbd7-4d19-be68-38e0008a5a3b\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.517007 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/25e45193-cbd7-4d19-be68-38e0008a5a3b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mwsr6\" (UID: \"25e45193-cbd7-4d19-be68-38e0008a5a3b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.517081 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/25e45193-cbd7-4d19-be68-38e0008a5a3b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mwsr6\" (UID: \"25e45193-cbd7-4d19-be68-38e0008a5a3b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.517154 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/25e45193-cbd7-4d19-be68-38e0008a5a3b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mwsr6\" (UID: \"25e45193-cbd7-4d19-be68-38e0008a5a3b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.517268 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/25e45193-cbd7-4d19-be68-38e0008a5a3b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mwsr6\" (UID: \"25e45193-cbd7-4d19-be68-38e0008a5a3b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.517332 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/25e45193-cbd7-4d19-be68-38e0008a5a3b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mwsr6\" (UID: \"25e45193-cbd7-4d19-be68-38e0008a5a3b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.518303 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/25e45193-cbd7-4d19-be68-38e0008a5a3b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mwsr6\" (UID: \"25e45193-cbd7-4d19-be68-38e0008a5a3b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.524589 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25e45193-cbd7-4d19-be68-38e0008a5a3b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mwsr6\" (UID: \"25e45193-cbd7-4d19-be68-38e0008a5a3b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.532558 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25e45193-cbd7-4d19-be68-38e0008a5a3b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mwsr6\" (UID: \"25e45193-cbd7-4d19-be68-38e0008a5a3b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" Mar 14 08:59:05 crc kubenswrapper[4956]: I0314 08:59:05.666314 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" Mar 14 08:59:06 crc kubenswrapper[4956]: I0314 08:59:06.208781 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:06 crc kubenswrapper[4956]: E0314 08:59:06.209731 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:59:06 crc kubenswrapper[4956]: I0314 08:59:06.238889 4956 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 14 08:59:06 crc kubenswrapper[4956]: I0314 08:59:06.252173 4956 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 14 08:59:06 crc kubenswrapper[4956]: I0314 08:59:06.255948 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" event={"ID":"25e45193-cbd7-4d19-be68-38e0008a5a3b","Type":"ContainerStarted","Data":"f6107a3a3e22f3b5069a6502fe016ab3b4c91c28959e1c698609c1dfe9c37397"} Mar 14 08:59:06 crc kubenswrapper[4956]: I0314 08:59:06.256003 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" event={"ID":"25e45193-cbd7-4d19-be68-38e0008a5a3b","Type":"ContainerStarted","Data":"27ada68bd240107597ecc6d6afed422147c308b14f2b7adbb6bd3444e5e531a2"} Mar 14 08:59:07 crc kubenswrapper[4956]: I0314 08:59:07.208431 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:59:07 crc kubenswrapper[4956]: I0314 08:59:07.208536 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:59:07 crc kubenswrapper[4956]: E0314 08:59:07.208620 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:59:07 crc kubenswrapper[4956]: E0314 08:59:07.208677 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:59:07 crc kubenswrapper[4956]: I0314 08:59:07.208679 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:59:07 crc kubenswrapper[4956]: E0314 08:59:07.208894 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:59:08 crc kubenswrapper[4956]: I0314 08:59:08.208558 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:08 crc kubenswrapper[4956]: E0314 08:59:08.208691 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:59:09 crc kubenswrapper[4956]: I0314 08:59:09.208630 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:59:09 crc kubenswrapper[4956]: I0314 08:59:09.208695 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:59:09 crc kubenswrapper[4956]: I0314 08:59:09.208751 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:59:09 crc kubenswrapper[4956]: E0314 08:59:09.208894 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:59:09 crc kubenswrapper[4956]: E0314 08:59:09.209046 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:59:09 crc kubenswrapper[4956]: E0314 08:59:09.209146 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:59:10 crc kubenswrapper[4956]: I0314 08:59:10.208640 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:10 crc kubenswrapper[4956]: E0314 08:59:10.208768 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:59:10 crc kubenswrapper[4956]: E0314 08:59:10.336419 4956 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 14 08:59:10 crc kubenswrapper[4956]: I0314 08:59:10.370294 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs\") pod \"network-metrics-daemon-42pn5\" (UID: \"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\") " pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:10 crc kubenswrapper[4956]: E0314 08:59:10.370550 4956 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:59:10 crc kubenswrapper[4956]: E0314 08:59:10.370644 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs podName:bf6ad235-d99c-46a7-8c2d-6fc12fc07c10 nodeName:}" failed. No retries permitted until 2026-03-14 08:59:42.370624936 +0000 UTC m=+187.883317194 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs") pod "network-metrics-daemon-42pn5" (UID: "bf6ad235-d99c-46a7-8c2d-6fc12fc07c10") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:59:11 crc kubenswrapper[4956]: I0314 08:59:11.208894 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:59:11 crc kubenswrapper[4956]: I0314 08:59:11.208947 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:59:11 crc kubenswrapper[4956]: I0314 08:59:11.208893 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:59:11 crc kubenswrapper[4956]: E0314 08:59:11.209041 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:59:11 crc kubenswrapper[4956]: E0314 08:59:11.209235 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:59:11 crc kubenswrapper[4956]: E0314 08:59:11.209271 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:59:12 crc kubenswrapper[4956]: I0314 08:59:12.208380 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:12 crc kubenswrapper[4956]: E0314 08:59:12.208585 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:59:12 crc kubenswrapper[4956]: I0314 08:59:12.283156 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgnxb_7528e098-09d4-436f-a32d-a0e82e76b8e0/kube-multus/0.log" Mar 14 08:59:12 crc kubenswrapper[4956]: I0314 08:59:12.283239 4956 generic.go:334] "Generic (PLEG): container finished" podID="7528e098-09d4-436f-a32d-a0e82e76b8e0" containerID="cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767" exitCode=1 Mar 14 08:59:12 crc kubenswrapper[4956]: I0314 08:59:12.283292 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgnxb" event={"ID":"7528e098-09d4-436f-a32d-a0e82e76b8e0","Type":"ContainerDied","Data":"cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767"} Mar 14 08:59:12 crc kubenswrapper[4956]: I0314 08:59:12.284082 4956 scope.go:117] "RemoveContainer" containerID="cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767" Mar 14 08:59:12 crc kubenswrapper[4956]: I0314 08:59:12.312802 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mwsr6" podStartSLOduration=84.312761236 podStartE2EDuration="1m24.312761236s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:06.272930202 +0000 UTC 
m=+151.785622470" watchObservedRunningTime="2026-03-14 08:59:12.312761236 +0000 UTC m=+157.825453544" Mar 14 08:59:13 crc kubenswrapper[4956]: I0314 08:59:13.208734 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:59:13 crc kubenswrapper[4956]: I0314 08:59:13.209648 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:59:13 crc kubenswrapper[4956]: E0314 08:59:13.209795 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:59:13 crc kubenswrapper[4956]: I0314 08:59:13.208734 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:59:13 crc kubenswrapper[4956]: E0314 08:59:13.210030 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:59:13 crc kubenswrapper[4956]: E0314 08:59:13.210316 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:59:13 crc kubenswrapper[4956]: I0314 08:59:13.290969 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgnxb_7528e098-09d4-436f-a32d-a0e82e76b8e0/kube-multus/0.log" Mar 14 08:59:13 crc kubenswrapper[4956]: I0314 08:59:13.291091 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgnxb" event={"ID":"7528e098-09d4-436f-a32d-a0e82e76b8e0","Type":"ContainerStarted","Data":"e11575d346470f0c65bf883c0676009985f639d05b04ccb994919585ff0ae99a"} Mar 14 08:59:14 crc kubenswrapper[4956]: I0314 08:59:14.209045 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:14 crc kubenswrapper[4956]: E0314 08:59:14.209168 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:59:15 crc kubenswrapper[4956]: I0314 08:59:15.208691 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:59:15 crc kubenswrapper[4956]: I0314 08:59:15.208818 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:59:15 crc kubenswrapper[4956]: E0314 08:59:15.210680 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:59:15 crc kubenswrapper[4956]: I0314 08:59:15.210760 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:59:15 crc kubenswrapper[4956]: E0314 08:59:15.210972 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:59:15 crc kubenswrapper[4956]: E0314 08:59:15.211044 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:59:15 crc kubenswrapper[4956]: E0314 08:59:15.336965 4956 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:59:16 crc kubenswrapper[4956]: I0314 08:59:16.208598 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:16 crc kubenswrapper[4956]: E0314 08:59:16.208730 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:59:17 crc kubenswrapper[4956]: I0314 08:59:17.208654 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:59:17 crc kubenswrapper[4956]: I0314 08:59:17.208755 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:59:17 crc kubenswrapper[4956]: I0314 08:59:17.208849 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:59:17 crc kubenswrapper[4956]: E0314 08:59:17.208859 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:59:17 crc kubenswrapper[4956]: E0314 08:59:17.209004 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:59:17 crc kubenswrapper[4956]: E0314 08:59:17.209250 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:59:18 crc kubenswrapper[4956]: I0314 08:59:18.208641 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:18 crc kubenswrapper[4956]: E0314 08:59:18.208764 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:59:18 crc kubenswrapper[4956]: I0314 08:59:18.209367 4956 scope.go:117] "RemoveContainer" containerID="ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3" Mar 14 08:59:19 crc kubenswrapper[4956]: I0314 08:59:19.037717 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-42pn5"] Mar 14 08:59:19 crc kubenswrapper[4956]: I0314 08:59:19.037849 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:19 crc kubenswrapper[4956]: E0314 08:59:19.037990 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:59:19 crc kubenswrapper[4956]: I0314 08:59:19.209450 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:59:19 crc kubenswrapper[4956]: I0314 08:59:19.209514 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:59:19 crc kubenswrapper[4956]: I0314 08:59:19.209554 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:59:19 crc kubenswrapper[4956]: E0314 08:59:19.209676 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:59:19 crc kubenswrapper[4956]: E0314 08:59:19.209824 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:59:19 crc kubenswrapper[4956]: E0314 08:59:19.209989 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:59:19 crc kubenswrapper[4956]: I0314 08:59:19.313521 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj4pg_57d4b4cb-2115-421e-8f2a-491ec851328c/ovnkube-controller/2.log" Mar 14 08:59:19 crc kubenswrapper[4956]: I0314 08:59:19.316284 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerStarted","Data":"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77"} Mar 14 08:59:19 crc kubenswrapper[4956]: I0314 08:59:19.317188 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 08:59:19 crc kubenswrapper[4956]: I0314 08:59:19.349460 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" podStartSLOduration=91.349433844 podStartE2EDuration="1m31.349433844s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:19.346650265 +0000 UTC m=+164.859342543" watchObservedRunningTime="2026-03-14 08:59:19.349433844 +0000 UTC m=+164.862126132" Mar 14 08:59:20 crc kubenswrapper[4956]: E0314 08:59:20.337831 4956 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:59:21 crc kubenswrapper[4956]: I0314 08:59:21.208932 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:59:21 crc kubenswrapper[4956]: I0314 08:59:21.208991 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:21 crc kubenswrapper[4956]: I0314 08:59:21.208945 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:59:21 crc kubenswrapper[4956]: I0314 08:59:21.209051 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:59:21 crc kubenswrapper[4956]: E0314 08:59:21.209087 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:59:21 crc kubenswrapper[4956]: E0314 08:59:21.209189 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:59:21 crc kubenswrapper[4956]: E0314 08:59:21.209259 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:59:21 crc kubenswrapper[4956]: E0314 08:59:21.209325 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:59:23 crc kubenswrapper[4956]: I0314 08:59:23.209635 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:59:23 crc kubenswrapper[4956]: I0314 08:59:23.209683 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:23 crc kubenswrapper[4956]: I0314 08:59:23.209733 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:59:23 crc kubenswrapper[4956]: I0314 08:59:23.209664 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:59:23 crc kubenswrapper[4956]: E0314 08:59:23.209845 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:59:23 crc kubenswrapper[4956]: E0314 08:59:23.210228 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:59:23 crc kubenswrapper[4956]: E0314 08:59:23.210332 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:59:23 crc kubenswrapper[4956]: E0314 08:59:23.210034 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.208409 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.208408 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:59:25 crc kubenswrapper[4956]: E0314 08:59:25.210687 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.210756 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.210768 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:25 crc kubenswrapper[4956]: E0314 08:59:25.210891 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:59:25 crc kubenswrapper[4956]: E0314 08:59:25.211117 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:59:25 crc kubenswrapper[4956]: E0314 08:59:25.211255 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42pn5" podUID="bf6ad235-d99c-46a7-8c2d-6fc12fc07c10" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.747356 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.796084 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vcs4p"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.797200 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.798393 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8gwm6"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.799691 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.800457 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4c865"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.801105 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4c865" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.802164 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.802818 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.804203 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.804833 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.805594 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m4xmw"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.806107 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.807601 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7bkrj"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.808357 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.809915 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.810173 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.810512 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.810683 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.810935 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.811975 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.815135 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.815311 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.815692 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.815694 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.816105 4956 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.816323 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.816498 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.816907 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.817235 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.817339 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.817625 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.817263 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.818361 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.818667 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.818937 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" 
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.819565 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.820552 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.820877 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.820248 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.821851 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.821965 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.822933 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.823671 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.823818 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqmv"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.824302 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.824356 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqmv"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.824615 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.824720 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.824761 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.824872 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.825066 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.825194 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.826884 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-776c5"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.832856 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-776c5"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.825209 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.825466 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.825627 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.825672 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.834898 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.826074 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.826125 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.826612 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.826648 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.826675 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.826706 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.826714 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.826764 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.825066 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.826815 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.826822 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.827317 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.827406 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.829321 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.829753 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.839133 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.839584 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.839699 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.840398 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.840582 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.841448 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-8qng2"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.842086 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8qng2"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.842761 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-lpkrv"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.843302 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-lpkrv"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.843734 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.844125 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.845013 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-st9hs"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.846076 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-st9hs"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.872000 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.876565 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.890707 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.892832 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.893054 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.893224 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.893305 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.893576 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.893609 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.893792 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5pqds"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.893840 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.894634 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.894752 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.895039 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.895131 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-j9qjz"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.895631 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.895803 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.896182 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.896232 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.896425 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.896442 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-d9kk5"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.896591 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.896717 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.896956 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.897300 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-d9kk5"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.897366 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.897585 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.897834 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.898187 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.898236 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9s4fs"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.898183 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.898888 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.898942 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.899178 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.900010 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.901788 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.901935 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.901981 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.902032 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.902362 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.903381 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-hsfss"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.903990 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hsfss"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.905177 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.908792 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.909009 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.911010 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pftg4"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.912544 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pftg4"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.916051 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.917272 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.917992 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.918773 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.921535 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7bkrj"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.922520 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7wqv"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.923001 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.923310 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.923407 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.923514 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7wqv"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.941671 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.942085 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.942224 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.942733 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.943036 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.943201 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.943903 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea621d77-5364-416c-a378-6adf0e89fc30-audit-dir\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.943946 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3953bf9-9497-4f03-a67e-e7d63bf1df9c-config\") pod \"authentication-operator-69f744f599-7bkrj\" (UID: \"f3953bf9-9497-4f03-a67e-e7d63bf1df9c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.943978 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3953bf9-9497-4f03-a67e-e7d63bf1df9c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7bkrj\" (UID: \"f3953bf9-9497-4f03-a67e-e7d63bf1df9c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954101 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954251 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954438 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4c865"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954473 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8gwm6"]
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.944011 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-audit-policies\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954612 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jltv\" (UniqueName: \"kubernetes.io/projected/22f83565-681d-490b-bd27-d21b456c6e25-kube-api-access-8jltv\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954634 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/966098d4-f89c-4deb-a6e9-b1fae3316324-encryption-config\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954655 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/095068b1-bf13-43a2-a250-a0eaeb60c6ae-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8gwm6\" (UID: \"095068b1-bf13-43a2-a250-a0eaeb60c6ae\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954674 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3953bf9-9497-4f03-a67e-e7d63bf1df9c-service-ca-bundle\") pod \"authentication-operator-69f744f599-7bkrj\" (UID: \"f3953bf9-9497-4f03-a67e-e7d63bf1df9c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954691 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct6th\" (UniqueName: \"kubernetes.io/projected/13a17846-77ac-4dba-a573-3b7d5c67da8d-kube-api-access-ct6th\") pod \"machine-approver-56656f9798-gq4nk\" (UID: \"13a17846-77ac-4dba-a573-3b7d5c67da8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954709 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-service-ca\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954729 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954745 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954761 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9ee8610e-b521-4ee6-8dda-01c0aa50ae3e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6bzr5\" (UID: \"9ee8610e-b521-4ee6-8dda-01c0aa50ae3e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954775 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea621d77-5364-416c-a378-6adf0e89fc30-serving-cert\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954794 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzwtw\" (UniqueName: \"kubernetes.io/projected/ea621d77-5364-416c-a378-6adf0e89fc30-kube-api-access-gzwtw\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954810 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-oauth-config\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954827 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44ffbc86-37af-4fa5-bc0c-58084b961597-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqmv\" (UID: \"44ffbc86-37af-4fa5-bc0c-58084b961597\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqmv"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954844 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt2xs\" (UniqueName: \"kubernetes.io/projected/9231d09e-fc87-40bf-85fe-c1d16e3ee943-kube-api-access-lt2xs\") pod \"dns-operator-744455d44c-776c5\" (UID: \"9231d09e-fc87-40bf-85fe-c1d16e3ee943\") " pod="openshift-dns-operator/dns-operator-744455d44c-776c5"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954861 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5b73e6d-b82a-48a3-8ae9-4afe284b4102-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4c865\" (UID: \"f5b73e6d-b82a-48a3-8ae9-4afe284b4102\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4c865"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954876 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/966098d4-f89c-4deb-a6e9-b1fae3316324-node-pullsecrets\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954891 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/966098d4-f89c-4deb-a6e9-b1fae3316324-audit-dir\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954907 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/966098d4-f89c-4deb-a6e9-b1fae3316324-audit\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954923 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/966098d4-f89c-4deb-a6e9-b1fae3316324-etcd-client\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954938 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/966098d4-f89c-4deb-a6e9-b1fae3316324-image-import-ca\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954957 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.954976 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955002 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px4wv\" (UniqueName: \"kubernetes.io/projected/f5b73e6d-b82a-48a3-8ae9-4afe284b4102-kube-api-access-px4wv\") pod \"openshift-apiserver-operator-796bbdcf4f-4c865\" (UID: \"f5b73e6d-b82a-48a3-8ae9-4afe284b4102\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4c865"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955021 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/13a17846-77ac-4dba-a573-3b7d5c67da8d-auth-proxy-config\") pod \"machine-approver-56656f9798-gq4nk\" (UID: \"13a17846-77ac-4dba-a573-3b7d5c67da8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955036 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-config\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955051 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44ffbc86-37af-4fa5-bc0c-58084b961597-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqmv\" (UID: \"44ffbc86-37af-4fa5-bc0c-58084b961597\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqmv"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955070 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ba3ebe-86e2-4711-a87d-9505fef09f76-config\") pod \"route-controller-manager-6576b87f9c-7x8jr\" (UID: \"66ba3ebe-86e2-4711-a87d-9505fef09f76\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr"
Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955084 4956 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85crd\" (UniqueName: \"kubernetes.io/projected/9ee8610e-b521-4ee6-8dda-01c0aa50ae3e-kube-api-access-85crd\") pod \"openshift-config-operator-7777fb866f-6bzr5\" (UID: \"9ee8610e-b521-4ee6-8dda-01c0aa50ae3e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955097 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ea621d77-5364-416c-a378-6adf0e89fc30-encryption-config\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955112 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gr2m\" (UniqueName: \"kubernetes.io/projected/44ffbc86-37af-4fa5-bc0c-58084b961597-kube-api-access-9gr2m\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqmv\" (UID: \"44ffbc86-37af-4fa5-bc0c-58084b961597\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqmv" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955129 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-serving-cert\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955144 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnj8b\" (UniqueName: 
\"kubernetes.io/projected/095068b1-bf13-43a2-a250-a0eaeb60c6ae-kube-api-access-fnj8b\") pod \"machine-api-operator-5694c8668f-8gwm6\" (UID: \"095068b1-bf13-43a2-a250-a0eaeb60c6ae\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955159 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/966098d4-f89c-4deb-a6e9-b1fae3316324-config\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955176 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3953bf9-9497-4f03-a67e-e7d63bf1df9c-serving-cert\") pod \"authentication-operator-69f744f599-7bkrj\" (UID: \"f3953bf9-9497-4f03-a67e-e7d63bf1df9c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955335 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/966098d4-f89c-4deb-a6e9-b1fae3316324-serving-cert\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955375 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/22f83565-681d-490b-bd27-d21b456c6e25-audit-dir\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955417 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13a17846-77ac-4dba-a573-3b7d5c67da8d-config\") pod \"machine-approver-56656f9798-gq4nk\" (UID: \"13a17846-77ac-4dba-a573-3b7d5c67da8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955500 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955592 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955700 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh6b2\" (UniqueName: \"kubernetes.io/projected/66ba3ebe-86e2-4711-a87d-9505fef09f76-kube-api-access-qh6b2\") pod \"route-controller-manager-6576b87f9c-7x8jr\" (UID: \"66ba3ebe-86e2-4711-a87d-9505fef09f76\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955744 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7rw7\" (UniqueName: 
\"kubernetes.io/projected/f3953bf9-9497-4f03-a67e-e7d63bf1df9c-kube-api-access-r7rw7\") pod \"authentication-operator-69f744f599-7bkrj\" (UID: \"f3953bf9-9497-4f03-a67e-e7d63bf1df9c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955772 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhkkw\" (UniqueName: \"kubernetes.io/projected/966098d4-f89c-4deb-a6e9-b1fae3316324-kube-api-access-mhkkw\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955795 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9231d09e-fc87-40bf-85fe-c1d16e3ee943-metrics-tls\") pod \"dns-operator-744455d44c-776c5\" (UID: \"9231d09e-fc87-40bf-85fe-c1d16e3ee943\") " pod="openshift-dns-operator/dns-operator-744455d44c-776c5" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955850 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955939 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ee8610e-b521-4ee6-8dda-01c0aa50ae3e-serving-cert\") pod \"openshift-config-operator-7777fb866f-6bzr5\" (UID: \"9ee8610e-b521-4ee6-8dda-01c0aa50ae3e\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.955988 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pktt5\" (UniqueName: \"kubernetes.io/projected/343e0673-aad7-49c1-91b7-f5fd88579db3-kube-api-access-pktt5\") pod \"downloads-7954f5f757-lpkrv\" (UID: \"343e0673-aad7-49c1-91b7-f5fd88579db3\") " pod="openshift-console/downloads-7954f5f757-lpkrv" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956014 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/13a17846-77ac-4dba-a573-3b7d5c67da8d-machine-approver-tls\") pod \"machine-approver-56656f9798-gq4nk\" (UID: \"13a17846-77ac-4dba-a573-3b7d5c67da8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956038 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-oauth-serving-cert\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956067 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/095068b1-bf13-43a2-a250-a0eaeb60c6ae-config\") pod \"machine-api-operator-5694c8668f-8gwm6\" (UID: \"095068b1-bf13-43a2-a250-a0eaeb60c6ae\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956124 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/ea621d77-5364-416c-a378-6adf0e89fc30-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956148 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea621d77-5364-416c-a378-6adf0e89fc30-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956168 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4443557c-c15b-435f-80a8-44916ff7c31a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-st9hs\" (UID: \"4443557c-c15b-435f-80a8-44916ff7c31a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-st9hs" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956196 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/095068b1-bf13-43a2-a250-a0eaeb60c6ae-images\") pod \"machine-api-operator-5694c8668f-8gwm6\" (UID: \"095068b1-bf13-43a2-a250-a0eaeb60c6ae\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956217 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqfls\" (UniqueName: \"kubernetes.io/projected/4443557c-c15b-435f-80a8-44916ff7c31a-kube-api-access-fqfls\") pod \"cluster-samples-operator-665b6dd947-st9hs\" (UID: \"4443557c-c15b-435f-80a8-44916ff7c31a\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-st9hs" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956284 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-trusted-ca-bundle\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956335 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ba3ebe-86e2-4711-a87d-9505fef09f76-serving-cert\") pod \"route-controller-manager-6576b87f9c-7x8jr\" (UID: \"66ba3ebe-86e2-4711-a87d-9505fef09f76\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956360 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ea621d77-5364-416c-a378-6adf0e89fc30-etcd-client\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956380 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v95dt\" (UniqueName: \"kubernetes.io/projected/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-kube-api-access-v95dt\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956416 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956523 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956559 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea621d77-5364-416c-a378-6adf0e89fc30-audit-policies\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956581 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/966098d4-f89c-4deb-a6e9-b1fae3316324-etcd-serving-ca\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956618 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956646 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956668 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66ba3ebe-86e2-4711-a87d-9505fef09f76-client-ca\") pod \"route-controller-manager-6576b87f9c-7x8jr\" (UID: \"66ba3ebe-86e2-4711-a87d-9505fef09f76\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956690 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5b73e6d-b82a-48a3-8ae9-4afe284b4102-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4c865\" (UID: \"f5b73e6d-b82a-48a3-8ae9-4afe284b4102\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4c865" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.956763 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/966098d4-f89c-4deb-a6e9-b1fae3316324-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.957231 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-m4xmw"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.957621 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.962138 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.963603 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8hw77"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.964326 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8hw77" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.966407 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5hxg"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.969006 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5hxg" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.969381 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.970193 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.970471 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ksv2k"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.970969 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ksv2k" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.971585 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.972298 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.972446 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2hj4h"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.973189 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.974664 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mqkjv"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.975636 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mqkjv" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.976002 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tb6lx"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.976788 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tb6lx" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.977545 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zd8f5"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.978125 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zd8f5" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.978533 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.978901 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.979680 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-dfkjs"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.980137 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-dfkjs" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.980610 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.980910 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.981808 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.982168 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.982765 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.984347 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.984979 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.985583 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.985787 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j" Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.986030 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-lpkrv"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.987578 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-776c5"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.999218 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt"] Mar 14 08:59:25 crc kubenswrapper[4956]: I0314 08:59:25.999572 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.000912 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-j9qjz"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.004343 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.004626 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vcs4p"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.006054 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqmv"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.007336 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-st9hs"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.008787 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jqf62"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 
08:59:26.009703 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.010394 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9s4fs"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.011444 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pftg4"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.012578 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8qng2"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.013617 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.014622 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.017761 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.019753 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.019774 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mqkjv"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.019880 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.021389 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca/service-ca-9c57cc56f-dfkjs"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.023012 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ksv2k"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.024034 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7wqv"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.025357 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8hw77"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.027415 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.028744 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2hj4h"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.030231 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.031448 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tb6lx"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.032818 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-d9kk5"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.035247 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.038008 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-m5sxz"] 
Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.043188 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m5sxz" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.049378 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.050368 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5pqds"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.050453 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zd8f5"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.050468 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5hxg"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.053067 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-d7cv4"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.053874 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m5sxz"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.053999 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-d7cv4" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.054463 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057432 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057542 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea621d77-5364-416c-a378-6adf0e89fc30-audit-dir\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057591 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3953bf9-9497-4f03-a67e-e7d63bf1df9c-config\") pod \"authentication-operator-69f744f599-7bkrj\" (UID: \"f3953bf9-9497-4f03-a67e-e7d63bf1df9c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057612 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3953bf9-9497-4f03-a67e-e7d63bf1df9c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7bkrj\" (UID: \"f3953bf9-9497-4f03-a67e-e7d63bf1df9c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057633 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-config-volume\") pod \"collect-profiles-29557965-jz6kr\" (UID: \"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057651 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jltv\" (UniqueName: \"kubernetes.io/projected/22f83565-681d-490b-bd27-d21b456c6e25-kube-api-access-8jltv\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057669 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/966098d4-f89c-4deb-a6e9-b1fae3316324-encryption-config\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057688 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-audit-policies\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057705 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/095068b1-bf13-43a2-a250-a0eaeb60c6ae-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8gwm6\" (UID: \"095068b1-bf13-43a2-a250-a0eaeb60c6ae\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057721 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrw2v\" (UniqueName: \"kubernetes.io/projected/b315e659-0690-463c-88c4-659124922ddc-kube-api-access-mrw2v\") pod \"service-ca-9c57cc56f-dfkjs\" (UID: \"b315e659-0690-463c-88c4-659124922ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dfkjs" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057740 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3953bf9-9497-4f03-a67e-e7d63bf1df9c-service-ca-bundle\") pod \"authentication-operator-69f744f599-7bkrj\" (UID: \"f3953bf9-9497-4f03-a67e-e7d63bf1df9c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057755 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c30cdb70-a97a-4750-8a55-e4aaa0e44f9b-images\") pod \"machine-config-operator-74547568cd-sw2ht\" (UID: \"c30cdb70-a97a-4750-8a55-e4aaa0e44f9b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057773 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057789 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct6th\" (UniqueName: \"kubernetes.io/projected/13a17846-77ac-4dba-a573-3b7d5c67da8d-kube-api-access-ct6th\") pod \"machine-approver-56656f9798-gq4nk\" 
(UID: \"13a17846-77ac-4dba-a573-3b7d5c67da8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057805 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-service-ca\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057823 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057842 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9ee8610e-b521-4ee6-8dda-01c0aa50ae3e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6bzr5\" (UID: \"9ee8610e-b521-4ee6-8dda-01c0aa50ae3e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057859 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea621d77-5364-416c-a378-6adf0e89fc30-serving-cert\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057875 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/44ffbc86-37af-4fa5-bc0c-58084b961597-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqmv\" (UID: \"44ffbc86-37af-4fa5-bc0c-58084b961597\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqmv" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057891 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzwtw\" (UniqueName: \"kubernetes.io/projected/ea621d77-5364-416c-a378-6adf0e89fc30-kube-api-access-gzwtw\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057905 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-oauth-config\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057920 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5b73e6d-b82a-48a3-8ae9-4afe284b4102-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4c865\" (UID: \"f5b73e6d-b82a-48a3-8ae9-4afe284b4102\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4c865" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057934 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/966098d4-f89c-4deb-a6e9-b1fae3316324-node-pullsecrets\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057951 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/966098d4-f89c-4deb-a6e9-b1fae3316324-audit-dir\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057967 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt2xs\" (UniqueName: \"kubernetes.io/projected/9231d09e-fc87-40bf-85fe-c1d16e3ee943-kube-api-access-lt2xs\") pod \"dns-operator-744455d44c-776c5\" (UID: \"9231d09e-fc87-40bf-85fe-c1d16e3ee943\") " pod="openshift-dns-operator/dns-operator-744455d44c-776c5" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057983 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/236467b8-ffbf-4e32-ba1c-b188938f8ff5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s5hxg\" (UID: \"236467b8-ffbf-4e32-ba1c-b188938f8ff5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5hxg" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.057999 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058016 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: 
\"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058033 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/966098d4-f89c-4deb-a6e9-b1fae3316324-audit\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058049 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/966098d4-f89c-4deb-a6e9-b1fae3316324-etcd-client\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058065 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/966098d4-f89c-4deb-a6e9-b1fae3316324-image-import-ca\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058083 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jbrf\" (UniqueName: \"kubernetes.io/projected/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-kube-api-access-5jbrf\") pod \"collect-profiles-29557965-jz6kr\" (UID: \"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058099 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c30cdb70-a97a-4750-8a55-e4aaa0e44f9b-proxy-tls\") pod 
\"machine-config-operator-74547568cd-sw2ht\" (UID: \"c30cdb70-a97a-4750-8a55-e4aaa0e44f9b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058114 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-c8m6n\" (UID: \"4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058132 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px4wv\" (UniqueName: \"kubernetes.io/projected/f5b73e6d-b82a-48a3-8ae9-4afe284b4102-kube-api-access-px4wv\") pod \"openshift-apiserver-operator-796bbdcf4f-4c865\" (UID: \"f5b73e6d-b82a-48a3-8ae9-4afe284b4102\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4c865" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058149 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ba3ebe-86e2-4711-a87d-9505fef09f76-config\") pod \"route-controller-manager-6576b87f9c-7x8jr\" (UID: \"66ba3ebe-86e2-4711-a87d-9505fef09f76\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058150 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea621d77-5364-416c-a378-6adf0e89fc30-audit-dir\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058167 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-85crd\" (UniqueName: \"kubernetes.io/projected/9ee8610e-b521-4ee6-8dda-01c0aa50ae3e-kube-api-access-85crd\") pod \"openshift-config-operator-7777fb866f-6bzr5\" (UID: \"9ee8610e-b521-4ee6-8dda-01c0aa50ae3e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058187 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ea621d77-5364-416c-a378-6adf0e89fc30-encryption-config\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058232 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/13a17846-77ac-4dba-a573-3b7d5c67da8d-auth-proxy-config\") pod \"machine-approver-56656f9798-gq4nk\" (UID: \"13a17846-77ac-4dba-a573-3b7d5c67da8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058249 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-config\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058265 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44ffbc86-37af-4fa5-bc0c-58084b961597-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqmv\" (UID: \"44ffbc86-37af-4fa5-bc0c-58084b961597\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqmv" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058303 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gr2m\" (UniqueName: \"kubernetes.io/projected/44ffbc86-37af-4fa5-bc0c-58084b961597-kube-api-access-9gr2m\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqmv\" (UID: \"44ffbc86-37af-4fa5-bc0c-58084b961597\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqmv" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058321 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ht66\" (UniqueName: \"kubernetes.io/projected/1666768a-cdbb-4ab1-83d8-b1ad0444f167-kube-api-access-7ht66\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7wqv\" (UID: \"1666768a-cdbb-4ab1-83d8-b1ad0444f167\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7wqv" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058337 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-serving-cert\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058353 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnj8b\" (UniqueName: \"kubernetes.io/projected/095068b1-bf13-43a2-a250-a0eaeb60c6ae-kube-api-access-fnj8b\") pod \"machine-api-operator-5694c8668f-8gwm6\" (UID: \"095068b1-bf13-43a2-a250-a0eaeb60c6ae\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058431 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/966098d4-f89c-4deb-a6e9-b1fae3316324-config\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058470 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3953bf9-9497-4f03-a67e-e7d63bf1df9c-serving-cert\") pod \"authentication-operator-69f744f599-7bkrj\" (UID: \"f3953bf9-9497-4f03-a67e-e7d63bf1df9c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058520 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/93b35280-bd38-4a0c-8192-807ce4f2eb0b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zc6w2\" (UID: \"93b35280-bd38-4a0c-8192-807ce4f2eb0b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058540 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-c8m6n\" (UID: \"4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058555 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-secret-volume\") pod 
\"collect-profiles-29557965-jz6kr\" (UID: \"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058595 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b616bb02-0180-4af3-922a-6a09d2da3d67-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pftg4\" (UID: \"b616bb02-0180-4af3-922a-6a09d2da3d67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pftg4" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058612 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b315e659-0690-463c-88c4-659124922ddc-signing-cabundle\") pod \"service-ca-9c57cc56f-dfkjs\" (UID: \"b315e659-0690-463c-88c4-659124922ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dfkjs" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058630 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/22f83565-681d-490b-bd27-d21b456c6e25-audit-dir\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058666 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/966098d4-f89c-4deb-a6e9-b1fae3316324-serving-cert\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058686 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/b616bb02-0180-4af3-922a-6a09d2da3d67-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pftg4\" (UID: \"b616bb02-0180-4af3-922a-6a09d2da3d67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pftg4" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058703 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7slv\" (UniqueName: \"kubernetes.io/projected/4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f-kube-api-access-x7slv\") pod \"cluster-image-registry-operator-dc59b4c8b-c8m6n\" (UID: \"4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058722 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058761 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058780 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh6b2\" (UniqueName: \"kubernetes.io/projected/66ba3ebe-86e2-4711-a87d-9505fef09f76-kube-api-access-qh6b2\") pod \"route-controller-manager-6576b87f9c-7x8jr\" (UID: 
\"66ba3ebe-86e2-4711-a87d-9505fef09f76\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058795 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13a17846-77ac-4dba-a573-3b7d5c67da8d-config\") pod \"machine-approver-56656f9798-gq4nk\" (UID: \"13a17846-77ac-4dba-a573-3b7d5c67da8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058832 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7rw7\" (UniqueName: \"kubernetes.io/projected/f3953bf9-9497-4f03-a67e-e7d63bf1df9c-kube-api-access-r7rw7\") pod \"authentication-operator-69f744f599-7bkrj\" (UID: \"f3953bf9-9497-4f03-a67e-e7d63bf1df9c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058851 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xxwh\" (UniqueName: \"kubernetes.io/projected/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-kube-api-access-9xxwh\") pod \"marketplace-operator-79b997595-2hj4h\" (UID: \"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058880 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058921 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ee8610e-b521-4ee6-8dda-01c0aa50ae3e-serving-cert\") pod \"openshift-config-operator-7777fb866f-6bzr5\" (UID: \"9ee8610e-b521-4ee6-8dda-01c0aa50ae3e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058936 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhkkw\" (UniqueName: \"kubernetes.io/projected/966098d4-f89c-4deb-a6e9-b1fae3316324-kube-api-access-mhkkw\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058951 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9231d09e-fc87-40bf-85fe-c1d16e3ee943-metrics-tls\") pod \"dns-operator-744455d44c-776c5\" (UID: \"9231d09e-fc87-40bf-85fe-c1d16e3ee943\") " pod="openshift-dns-operator/dns-operator-744455d44c-776c5" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.058986 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pktt5\" (UniqueName: \"kubernetes.io/projected/343e0673-aad7-49c1-91b7-f5fd88579db3-kube-api-access-pktt5\") pod \"downloads-7954f5f757-lpkrv\" (UID: \"343e0673-aad7-49c1-91b7-f5fd88579db3\") " pod="openshift-console/downloads-7954f5f757-lpkrv" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059004 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/236467b8-ffbf-4e32-ba1c-b188938f8ff5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s5hxg\" (UID: \"236467b8-ffbf-4e32-ba1c-b188938f8ff5\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5hxg" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059033 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ea621d77-5364-416c-a378-6adf0e89fc30-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059049 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea621d77-5364-416c-a378-6adf0e89fc30-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059065 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/13a17846-77ac-4dba-a573-3b7d5c67da8d-machine-approver-tls\") pod \"machine-approver-56656f9798-gq4nk\" (UID: \"13a17846-77ac-4dba-a573-3b7d5c67da8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059079 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-oauth-serving-cert\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059094 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/095068b1-bf13-43a2-a250-a0eaeb60c6ae-config\") pod 
\"machine-api-operator-5694c8668f-8gwm6\" (UID: \"095068b1-bf13-43a2-a250-a0eaeb60c6ae\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059119 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4443557c-c15b-435f-80a8-44916ff7c31a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-st9hs\" (UID: \"4443557c-c15b-435f-80a8-44916ff7c31a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-st9hs" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059156 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/095068b1-bf13-43a2-a250-a0eaeb60c6ae-images\") pod \"machine-api-operator-5694c8668f-8gwm6\" (UID: \"095068b1-bf13-43a2-a250-a0eaeb60c6ae\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059172 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqfls\" (UniqueName: \"kubernetes.io/projected/4443557c-c15b-435f-80a8-44916ff7c31a-kube-api-access-fqfls\") pod \"cluster-samples-operator-665b6dd947-st9hs\" (UID: \"4443557c-c15b-435f-80a8-44916ff7c31a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-st9hs" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059188 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-trusted-ca-bundle\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059205 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ba3ebe-86e2-4711-a87d-9505fef09f76-serving-cert\") pod \"route-controller-manager-6576b87f9c-7x8jr\" (UID: \"66ba3ebe-86e2-4711-a87d-9505fef09f76\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059222 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1666768a-cdbb-4ab1-83d8-b1ad0444f167-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7wqv\" (UID: \"1666768a-cdbb-4ab1-83d8-b1ad0444f167\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7wqv" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059261 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kzfq\" (UniqueName: \"kubernetes.io/projected/93b35280-bd38-4a0c-8192-807ce4f2eb0b-kube-api-access-2kzfq\") pod \"package-server-manager-789f6589d5-zc6w2\" (UID: \"93b35280-bd38-4a0c-8192-807ce4f2eb0b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059290 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059324 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ea621d77-5364-416c-a378-6adf0e89fc30-etcd-client\") pod 
\"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059344 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v95dt\" (UniqueName: \"kubernetes.io/projected/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-kube-api-access-v95dt\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059360 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b616bb02-0180-4af3-922a-6a09d2da3d67-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pftg4\" (UID: \"b616bb02-0180-4af3-922a-6a09d2da3d67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pftg4" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059358 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3953bf9-9497-4f03-a67e-e7d63bf1df9c-config\") pod \"authentication-operator-69f744f599-7bkrj\" (UID: \"f3953bf9-9497-4f03-a67e-e7d63bf1df9c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059376 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2hj4h\" (UID: \"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059422 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c30cdb70-a97a-4750-8a55-e4aaa0e44f9b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sw2ht\" (UID: \"c30cdb70-a97a-4750-8a55-e4aaa0e44f9b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059440 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/966098d4-f89c-4deb-a6e9-b1fae3316324-etcd-serving-ca\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059458 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059508 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea621d77-5364-416c-a378-6adf0e89fc30-audit-policies\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059536 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059575 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236467b8-ffbf-4e32-ba1c-b188938f8ff5-config\") pod \"kube-apiserver-operator-766d6c64bb-s5hxg\" (UID: \"236467b8-ffbf-4e32-ba1c-b188938f8ff5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5hxg" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059595 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2hj4h\" (UID: \"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059620 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-c8m6n\" (UID: \"4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059670 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059688 4956 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66ba3ebe-86e2-4711-a87d-9505fef09f76-client-ca\") pod \"route-controller-manager-6576b87f9c-7x8jr\" (UID: \"66ba3ebe-86e2-4711-a87d-9505fef09f76\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059705 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5b73e6d-b82a-48a3-8ae9-4afe284b4102-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4c865\" (UID: \"f5b73e6d-b82a-48a3-8ae9-4afe284b4102\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4c865" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059728 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b315e659-0690-463c-88c4-659124922ddc-signing-key\") pod \"service-ca-9c57cc56f-dfkjs\" (UID: \"b315e659-0690-463c-88c4-659124922ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dfkjs" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059744 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt6d6\" (UniqueName: \"kubernetes.io/projected/c30cdb70-a97a-4750-8a55-e4aaa0e44f9b-kube-api-access-wt6d6\") pod \"machine-config-operator-74547568cd-sw2ht\" (UID: \"c30cdb70-a97a-4750-8a55-e4aaa0e44f9b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059761 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/966098d4-f89c-4deb-a6e9-b1fae3316324-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" 
Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059861 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9ee8610e-b521-4ee6-8dda-01c0aa50ae3e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6bzr5\" (UID: \"9ee8610e-b521-4ee6-8dda-01c0aa50ae3e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.060674 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/966098d4-f89c-4deb-a6e9-b1fae3316324-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.060724 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3953bf9-9497-4f03-a67e-e7d63bf1df9c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7bkrj\" (UID: \"f3953bf9-9497-4f03-a67e-e7d63bf1df9c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.061391 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3953bf9-9497-4f03-a67e-e7d63bf1df9c-service-ca-bundle\") pod \"authentication-operator-69f744f599-7bkrj\" (UID: \"f3953bf9-9497-4f03-a67e-e7d63bf1df9c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.062088 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ba3ebe-86e2-4711-a87d-9505fef09f76-config\") pod \"route-controller-manager-6576b87f9c-7x8jr\" (UID: 
\"66ba3ebe-86e2-4711-a87d-9505fef09f76\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.062354 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.062861 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/966098d4-f89c-4deb-a6e9-b1fae3316324-audit-dir\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.063515 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44ffbc86-37af-4fa5-bc0c-58084b961597-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqmv\" (UID: \"44ffbc86-37af-4fa5-bc0c-58084b961597\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqmv" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.064307 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-audit-policies\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.064341 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/13a17846-77ac-4dba-a573-3b7d5c67da8d-auth-proxy-config\") pod \"machine-approver-56656f9798-gq4nk\" (UID: \"13a17846-77ac-4dba-a573-3b7d5c67da8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk" Mar 14 08:59:26 crc 
kubenswrapper[4956]: I0314 08:59:26.065422 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-config\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.064366 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.066741 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-oauth-config\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.066807 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/095068b1-bf13-43a2-a250-a0eaeb60c6ae-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8gwm6\" (UID: \"095068b1-bf13-43a2-a250-a0eaeb60c6ae\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.066834 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/966098d4-f89c-4deb-a6e9-b1fae3316324-node-pullsecrets\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.067118 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/966098d4-f89c-4deb-a6e9-b1fae3316324-encryption-config\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.067136 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ea621d77-5364-416c-a378-6adf0e89fc30-encryption-config\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.067245 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ea621d77-5364-416c-a378-6adf0e89fc30-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.067575 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5b73e6d-b82a-48a3-8ae9-4afe284b4102-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4c865\" (UID: \"f5b73e6d-b82a-48a3-8ae9-4afe284b4102\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4c865" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.067662 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea621d77-5364-416c-a378-6adf0e89fc30-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.059454 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" 
(UniqueName: \"kubernetes.io/configmap/966098d4-f89c-4deb-a6e9-b1fae3316324-image-import-ca\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.067974 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.068128 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13a17846-77ac-4dba-a573-3b7d5c67da8d-config\") pod \"machine-approver-56656f9798-gq4nk\" (UID: \"13a17846-77ac-4dba-a573-3b7d5c67da8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.068594 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jqf62"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.068970 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/966098d4-f89c-4deb-a6e9-b1fae3316324-etcd-serving-ca\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.069171 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-service-ca\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2" 
Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.069616 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.069700 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5b73e6d-b82a-48a3-8ae9-4afe284b4102-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4c865\" (UID: \"f5b73e6d-b82a-48a3-8ae9-4afe284b4102\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4c865" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.070584 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/966098d4-f89c-4deb-a6e9-b1fae3316324-config\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.070604 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea621d77-5364-416c-a378-6adf0e89fc30-serving-cert\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.070740 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-serving-cert\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " 
pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.071237 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.071263 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea621d77-5364-416c-a378-6adf0e89fc30-audit-policies\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.071281 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-trusted-ca-bundle\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.071357 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/22f83565-681d-490b-bd27-d21b456c6e25-audit-dir\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.071923 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/966098d4-f89c-4deb-a6e9-b1fae3316324-audit\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " 
pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.072169 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xm4x6"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.072952 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xm4x6"] Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.073053 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xm4x6" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.073264 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66ba3ebe-86e2-4711-a87d-9505fef09f76-client-ca\") pod \"route-controller-manager-6576b87f9c-7x8jr\" (UID: \"66ba3ebe-86e2-4711-a87d-9505fef09f76\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.073313 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-oauth-serving-cert\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.073578 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.074007 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/095068b1-bf13-43a2-a250-a0eaeb60c6ae-images\") pod \"machine-api-operator-5694c8668f-8gwm6\" (UID: \"095068b1-bf13-43a2-a250-a0eaeb60c6ae\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.074127 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/095068b1-bf13-43a2-a250-a0eaeb60c6ae-config\") pod \"machine-api-operator-5694c8668f-8gwm6\" (UID: \"095068b1-bf13-43a2-a250-a0eaeb60c6ae\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.075436 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.075612 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.075685 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.076064 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44ffbc86-37af-4fa5-bc0c-58084b961597-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqmv\" (UID: \"44ffbc86-37af-4fa5-bc0c-58084b961597\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqmv" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.076153 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.076160 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.076720 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ee8610e-b521-4ee6-8dda-01c0aa50ae3e-serving-cert\") pod \"openshift-config-operator-7777fb866f-6bzr5\" (UID: \"9ee8610e-b521-4ee6-8dda-01c0aa50ae3e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.076970 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4443557c-c15b-435f-80a8-44916ff7c31a-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-st9hs\" (UID: \"4443557c-c15b-435f-80a8-44916ff7c31a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-st9hs" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.077108 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ea621d77-5364-416c-a378-6adf0e89fc30-etcd-client\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.077154 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/966098d4-f89c-4deb-a6e9-b1fae3316324-serving-cert\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.077332 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9231d09e-fc87-40bf-85fe-c1d16e3ee943-metrics-tls\") pod \"dns-operator-744455d44c-776c5\" (UID: \"9231d09e-fc87-40bf-85fe-c1d16e3ee943\") " pod="openshift-dns-operator/dns-operator-744455d44c-776c5" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.077371 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/13a17846-77ac-4dba-a573-3b7d5c67da8d-machine-approver-tls\") pod \"machine-approver-56656f9798-gq4nk\" (UID: \"13a17846-77ac-4dba-a573-3b7d5c67da8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.077448 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3953bf9-9497-4f03-a67e-e7d63bf1df9c-serving-cert\") 
pod \"authentication-operator-69f744f599-7bkrj\" (UID: \"f3953bf9-9497-4f03-a67e-e7d63bf1df9c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.077833 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.078115 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ba3ebe-86e2-4711-a87d-9505fef09f76-serving-cert\") pod \"route-controller-manager-6576b87f9c-7x8jr\" (UID: \"66ba3ebe-86e2-4711-a87d-9505fef09f76\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.078673 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.079148 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.081751 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/966098d4-f89c-4deb-a6e9-b1fae3316324-etcd-client\") pod \"apiserver-76f77b778f-vcs4p\" (UID: 
\"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.106314 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.119943 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.139512 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.159206 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.160900 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1666768a-cdbb-4ab1-83d8-b1ad0444f167-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7wqv\" (UID: \"1666768a-cdbb-4ab1-83d8-b1ad0444f167\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7wqv" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.160956 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kzfq\" (UniqueName: \"kubernetes.io/projected/93b35280-bd38-4a0c-8192-807ce4f2eb0b-kube-api-access-2kzfq\") pod \"package-server-manager-789f6589d5-zc6w2\" (UID: \"93b35280-bd38-4a0c-8192-807ce4f2eb0b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.160982 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b616bb02-0180-4af3-922a-6a09d2da3d67-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pftg4\" (UID: \"b616bb02-0180-4af3-922a-6a09d2da3d67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pftg4" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161019 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2hj4h\" (UID: \"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161058 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c30cdb70-a97a-4750-8a55-e4aaa0e44f9b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sw2ht\" (UID: \"c30cdb70-a97a-4750-8a55-e4aaa0e44f9b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161084 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236467b8-ffbf-4e32-ba1c-b188938f8ff5-config\") pod \"kube-apiserver-operator-766d6c64bb-s5hxg\" (UID: \"236467b8-ffbf-4e32-ba1c-b188938f8ff5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5hxg" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161104 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2hj4h\" (UID: \"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161125 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-c8m6n\" (UID: \"4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161174 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b315e659-0690-463c-88c4-659124922ddc-signing-key\") pod \"service-ca-9c57cc56f-dfkjs\" (UID: \"b315e659-0690-463c-88c4-659124922ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dfkjs" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161215 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt6d6\" (UniqueName: \"kubernetes.io/projected/c30cdb70-a97a-4750-8a55-e4aaa0e44f9b-kube-api-access-wt6d6\") pod \"machine-config-operator-74547568cd-sw2ht\" (UID: \"c30cdb70-a97a-4750-8a55-e4aaa0e44f9b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161242 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-config-volume\") pod \"collect-profiles-29557965-jz6kr\" (UID: \"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161276 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrw2v\" (UniqueName: 
\"kubernetes.io/projected/b315e659-0690-463c-88c4-659124922ddc-kube-api-access-mrw2v\") pod \"service-ca-9c57cc56f-dfkjs\" (UID: \"b315e659-0690-463c-88c4-659124922ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dfkjs" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161304 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c30cdb70-a97a-4750-8a55-e4aaa0e44f9b-images\") pod \"machine-config-operator-74547568cd-sw2ht\" (UID: \"c30cdb70-a97a-4750-8a55-e4aaa0e44f9b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161353 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/236467b8-ffbf-4e32-ba1c-b188938f8ff5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s5hxg\" (UID: \"236467b8-ffbf-4e32-ba1c-b188938f8ff5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5hxg" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161375 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c30cdb70-a97a-4750-8a55-e4aaa0e44f9b-proxy-tls\") pod \"machine-config-operator-74547568cd-sw2ht\" (UID: \"c30cdb70-a97a-4750-8a55-e4aaa0e44f9b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161397 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-c8m6n\" (UID: \"4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161424 
4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jbrf\" (UniqueName: \"kubernetes.io/projected/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-kube-api-access-5jbrf\") pod \"collect-profiles-29557965-jz6kr\" (UID: \"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161473 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ht66\" (UniqueName: \"kubernetes.io/projected/1666768a-cdbb-4ab1-83d8-b1ad0444f167-kube-api-access-7ht66\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7wqv\" (UID: \"1666768a-cdbb-4ab1-83d8-b1ad0444f167\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7wqv" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161519 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/93b35280-bd38-4a0c-8192-807ce4f2eb0b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zc6w2\" (UID: \"93b35280-bd38-4a0c-8192-807ce4f2eb0b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161544 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-c8m6n\" (UID: \"4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161563 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-secret-volume\") pod \"collect-profiles-29557965-jz6kr\" (UID: \"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161584 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b315e659-0690-463c-88c4-659124922ddc-signing-cabundle\") pod \"service-ca-9c57cc56f-dfkjs\" (UID: \"b315e659-0690-463c-88c4-659124922ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dfkjs" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161607 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b616bb02-0180-4af3-922a-6a09d2da3d67-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pftg4\" (UID: \"b616bb02-0180-4af3-922a-6a09d2da3d67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pftg4" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161628 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7slv\" (UniqueName: \"kubernetes.io/projected/4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f-kube-api-access-x7slv\") pod \"cluster-image-registry-operator-dc59b4c8b-c8m6n\" (UID: \"4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161661 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b616bb02-0180-4af3-922a-6a09d2da3d67-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pftg4\" (UID: \"b616bb02-0180-4af3-922a-6a09d2da3d67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pftg4" Mar 14 
08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161704 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xxwh\" (UniqueName: \"kubernetes.io/projected/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-kube-api-access-9xxwh\") pod \"marketplace-operator-79b997595-2hj4h\" (UID: \"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.161748 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/236467b8-ffbf-4e32-ba1c-b188938f8ff5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s5hxg\" (UID: \"236467b8-ffbf-4e32-ba1c-b188938f8ff5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5hxg" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.162650 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c30cdb70-a97a-4750-8a55-e4aaa0e44f9b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sw2ht\" (UID: \"c30cdb70-a97a-4750-8a55-e4aaa0e44f9b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.172890 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-c8m6n\" (UID: \"4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.179801 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.199595 
4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.232887 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.239767 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.242798 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-c8m6n\" (UID: \"4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.259982 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.279167 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.299264 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.319305 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.340583 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.359432 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 
08:59:26.380232 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.400053 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.419220 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.425778 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b616bb02-0180-4af3-922a-6a09d2da3d67-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pftg4\" (UID: \"b616bb02-0180-4af3-922a-6a09d2da3d67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pftg4" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.440235 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.443713 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b616bb02-0180-4af3-922a-6a09d2da3d67-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pftg4\" (UID: \"b616bb02-0180-4af3-922a-6a09d2da3d67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pftg4" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.480313 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.509875 4956 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"trusted-ca" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.520643 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.541129 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.560572 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.580340 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.586065 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1666768a-cdbb-4ab1-83d8-b1ad0444f167-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7wqv\" (UID: \"1666768a-cdbb-4ab1-83d8-b1ad0444f167\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7wqv" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.601065 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.618949 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.639939 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.660013 4956 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.680016 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.699870 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.720605 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.725674 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/236467b8-ffbf-4e32-ba1c-b188938f8ff5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s5hxg\" (UID: \"236467b8-ffbf-4e32-ba1c-b188938f8ff5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5hxg" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.741336 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.760974 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.782115 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.792830 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236467b8-ffbf-4e32-ba1c-b188938f8ff5-config\") pod 
\"kube-apiserver-operator-766d6c64bb-s5hxg\" (UID: \"236467b8-ffbf-4e32-ba1c-b188938f8ff5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5hxg" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.800595 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.819889 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.840007 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.860029 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.880359 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.900197 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.919850 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.940008 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.960358 4956 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.967018 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c30cdb70-a97a-4750-8a55-e4aaa0e44f9b-proxy-tls\") pod \"machine-config-operator-74547568cd-sw2ht\" (UID: \"c30cdb70-a97a-4750-8a55-e4aaa0e44f9b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.977683 4956 request.go:700] Waited for 1.003878162s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/secrets?fieldSelector=metadata.name%3Dmarketplace-operator-dockercfg-5nsgg&limit=500&resourceVersion=0 Mar 14 08:59:26 crc kubenswrapper[4956]: I0314 08:59:26.979843 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.000339 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.027962 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.033515 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2hj4h\" (UID: \"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.040624 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 14 
08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.044526 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2hj4h\" (UID: \"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.059231 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.062539 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c30cdb70-a97a-4750-8a55-e4aaa0e44f9b-images\") pod \"machine-config-operator-74547568cd-sw2ht\" (UID: \"c30cdb70-a97a-4750-8a55-e4aaa0e44f9b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.080536 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.099223 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.119183 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.140010 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.158953 4956 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 14 08:59:27 crc kubenswrapper[4956]: E0314 08:59:27.161956 4956 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 14 08:59:27 crc kubenswrapper[4956]: E0314 08:59:27.161989 4956 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Mar 14 08:59:27 crc kubenswrapper[4956]: E0314 08:59:27.162022 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93b35280-bd38-4a0c-8192-807ce4f2eb0b-package-server-manager-serving-cert podName:93b35280-bd38-4a0c-8192-807ce4f2eb0b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:27.661999758 +0000 UTC m=+173.174692026 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/93b35280-bd38-4a0c-8192-807ce4f2eb0b-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-zc6w2" (UID: "93b35280-bd38-4a0c-8192-807ce4f2eb0b") : failed to sync secret cache: timed out waiting for the condition Mar 14 08:59:27 crc kubenswrapper[4956]: E0314 08:59:27.162046 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b315e659-0690-463c-88c4-659124922ddc-signing-key podName:b315e659-0690-463c-88c4-659124922ddc nodeName:}" failed. No retries permitted until 2026-03-14 08:59:27.662029609 +0000 UTC m=+173.174721897 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/b315e659-0690-463c-88c4-659124922ddc-signing-key") pod "service-ca-9c57cc56f-dfkjs" (UID: "b315e659-0690-463c-88c4-659124922ddc") : failed to sync secret cache: timed out waiting for the condition Mar 14 08:59:27 crc kubenswrapper[4956]: E0314 08:59:27.162074 4956 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Mar 14 08:59:27 crc kubenswrapper[4956]: E0314 08:59:27.162103 4956 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Mar 14 08:59:27 crc kubenswrapper[4956]: E0314 08:59:27.162093 4956 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Mar 14 08:59:27 crc kubenswrapper[4956]: E0314 08:59:27.162131 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-secret-volume podName:ea6e9606-19aa-43f7-8344-ebc9f5c3f31a nodeName:}" failed. No retries permitted until 2026-03-14 08:59:27.662122991 +0000 UTC m=+173.174815269 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-volume" (UniqueName: "kubernetes.io/secret/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-secret-volume") pod "collect-profiles-29557965-jz6kr" (UID: "ea6e9606-19aa-43f7-8344-ebc9f5c3f31a") : failed to sync secret cache: timed out waiting for the condition Mar 14 08:59:27 crc kubenswrapper[4956]: E0314 08:59:27.162192 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-config-volume podName:ea6e9606-19aa-43f7-8344-ebc9f5c3f31a nodeName:}" failed. 
No retries permitted until 2026-03-14 08:59:27.662171652 +0000 UTC m=+173.174863950 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-config-volume") pod "collect-profiles-29557965-jz6kr" (UID: "ea6e9606-19aa-43f7-8344-ebc9f5c3f31a") : failed to sync configmap cache: timed out waiting for the condition Mar 14 08:59:27 crc kubenswrapper[4956]: E0314 08:59:27.162215 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b315e659-0690-463c-88c4-659124922ddc-signing-cabundle podName:b315e659-0690-463c-88c4-659124922ddc nodeName:}" failed. No retries permitted until 2026-03-14 08:59:27.662202243 +0000 UTC m=+173.174894541 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/b315e659-0690-463c-88c4-659124922ddc-signing-cabundle") pod "service-ca-9c57cc56f-dfkjs" (UID: "b315e659-0690-463c-88c4-659124922ddc") : failed to sync configmap cache: timed out waiting for the condition Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.179496 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.199877 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.208661 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.208708 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.208918 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.208964 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.221386 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.240093 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.259824 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.280318 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.300448 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.320233 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.340583 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.359378 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.379544 4956 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.399764 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.420094 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.439717 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.460472 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.480079 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.500519 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.519239 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.540158 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.559267 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.599658 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 
08:59:27.619441 4956 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.639659 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.660094 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.679820 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.682744 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-secret-volume\") pod \"collect-profiles-29557965-jz6kr\" (UID: \"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.682783 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/93b35280-bd38-4a0c-8192-807ce4f2eb0b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zc6w2\" (UID: \"93b35280-bd38-4a0c-8192-807ce4f2eb0b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.682806 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b315e659-0690-463c-88c4-659124922ddc-signing-cabundle\") pod \"service-ca-9c57cc56f-dfkjs\" (UID: \"b315e659-0690-463c-88c4-659124922ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dfkjs" Mar 14 08:59:27 crc 
kubenswrapper[4956]: I0314 08:59:27.682933 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b315e659-0690-463c-88c4-659124922ddc-signing-key\") pod \"service-ca-9c57cc56f-dfkjs\" (UID: \"b315e659-0690-463c-88c4-659124922ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dfkjs" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.682960 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-config-volume\") pod \"collect-profiles-29557965-jz6kr\" (UID: \"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.683940 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-config-volume\") pod \"collect-profiles-29557965-jz6kr\" (UID: \"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.684730 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b315e659-0690-463c-88c4-659124922ddc-signing-cabundle\") pod \"service-ca-9c57cc56f-dfkjs\" (UID: \"b315e659-0690-463c-88c4-659124922ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dfkjs" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.688311 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/93b35280-bd38-4a0c-8192-807ce4f2eb0b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zc6w2\" (UID: \"93b35280-bd38-4a0c-8192-807ce4f2eb0b\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.688363 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b315e659-0690-463c-88c4-659124922ddc-signing-key\") pod \"service-ca-9c57cc56f-dfkjs\" (UID: \"b315e659-0690-463c-88c4-659124922ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dfkjs" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.689064 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-secret-volume\") pod \"collect-profiles-29557965-jz6kr\" (UID: \"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.699814 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.720664 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.740339 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.759905 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.807749 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jltv\" (UniqueName: \"kubernetes.io/projected/22f83565-681d-490b-bd27-d21b456c6e25-kube-api-access-8jltv\") pod \"oauth-openshift-558db77b4-m4xmw\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.816345 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px4wv\" (UniqueName: \"kubernetes.io/projected/f5b73e6d-b82a-48a3-8ae9-4afe284b4102-kube-api-access-px4wv\") pod \"openshift-apiserver-operator-796bbdcf4f-4c865\" (UID: \"f5b73e6d-b82a-48a3-8ae9-4afe284b4102\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4c865" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.844381 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85crd\" (UniqueName: \"kubernetes.io/projected/9ee8610e-b521-4ee6-8dda-01c0aa50ae3e-kube-api-access-85crd\") pod \"openshift-config-operator-7777fb866f-6bzr5\" (UID: \"9ee8610e-b521-4ee6-8dda-01c0aa50ae3e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.866932 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzwtw\" (UniqueName: \"kubernetes.io/projected/ea621d77-5364-416c-a378-6adf0e89fc30-kube-api-access-gzwtw\") pod \"apiserver-7bbb656c7d-whkvt\" (UID: \"ea621d77-5364-416c-a378-6adf0e89fc30\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.877959 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gr2m\" (UniqueName: \"kubernetes.io/projected/44ffbc86-37af-4fa5-bc0c-58084b961597-kube-api-access-9gr2m\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqmv\" (UID: \"44ffbc86-37af-4fa5-bc0c-58084b961597\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqmv" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.905080 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh6b2\" 
(UniqueName: \"kubernetes.io/projected/66ba3ebe-86e2-4711-a87d-9505fef09f76-kube-api-access-qh6b2\") pod \"route-controller-manager-6576b87f9c-7x8jr\" (UID: \"66ba3ebe-86e2-4711-a87d-9505fef09f76\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.915062 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct6th\" (UniqueName: \"kubernetes.io/projected/13a17846-77ac-4dba-a573-3b7d5c67da8d-kube-api-access-ct6th\") pod \"machine-approver-56656f9798-gq4nk\" (UID: \"13a17846-77ac-4dba-a573-3b7d5c67da8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.944911 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7rw7\" (UniqueName: \"kubernetes.io/projected/f3953bf9-9497-4f03-a67e-e7d63bf1df9c-kube-api-access-r7rw7\") pod \"authentication-operator-69f744f599-7bkrj\" (UID: \"f3953bf9-9497-4f03-a67e-e7d63bf1df9c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.966060 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pktt5\" (UniqueName: \"kubernetes.io/projected/343e0673-aad7-49c1-91b7-f5fd88579db3-kube-api-access-pktt5\") pod \"downloads-7954f5f757-lpkrv\" (UID: \"343e0673-aad7-49c1-91b7-f5fd88579db3\") " pod="openshift-console/downloads-7954f5f757-lpkrv" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.973652 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnj8b\" (UniqueName: \"kubernetes.io/projected/095068b1-bf13-43a2-a250-a0eaeb60c6ae-kube-api-access-fnj8b\") pod \"machine-api-operator-5694c8668f-8gwm6\" (UID: \"095068b1-bf13-43a2-a250-a0eaeb60c6ae\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6" Mar 14 08:59:27 
crc kubenswrapper[4956]: I0314 08:59:27.977793 4956 request.go:700] Waited for 1.906396139s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns-operator/serviceaccounts/dns-operator/token Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.993844 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt2xs\" (UniqueName: \"kubernetes.io/projected/9231d09e-fc87-40bf-85fe-c1d16e3ee943-kube-api-access-lt2xs\") pod \"dns-operator-744455d44c-776c5\" (UID: \"9231d09e-fc87-40bf-85fe-c1d16e3ee943\") " pod="openshift-dns-operator/dns-operator-744455d44c-776c5" Mar 14 08:59:27 crc kubenswrapper[4956]: I0314 08:59:27.995570 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.008832 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4c865" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.031472 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqfls\" (UniqueName: \"kubernetes.io/projected/4443557c-c15b-435f-80a8-44916ff7c31a-kube-api-access-fqfls\") pod \"cluster-samples-operator-665b6dd947-st9hs\" (UID: \"4443557c-c15b-435f-80a8-44916ff7c31a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-st9hs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.037219 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhkkw\" (UniqueName: \"kubernetes.io/projected/966098d4-f89c-4deb-a6e9-b1fae3316324-kube-api-access-mhkkw\") pod \"apiserver-76f77b778f-vcs4p\" (UID: \"966098d4-f89c-4deb-a6e9-b1fae3316324\") " pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.059771 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.064559 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v95dt\" (UniqueName: \"kubernetes.io/projected/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-kube-api-access-v95dt\") pod \"console-f9d7485db-8qng2\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.066836 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.078435 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.080593 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.090374 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.098228 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.100735 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 14 08:59:28 crc kubenswrapper[4956]: W0314 08:59:28.100895 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13a17846_77ac_4dba_a573_3b7d5c67da8d.slice/crio-9d900482029c84fd9ecafec4792a9728b4f51c704767731e783a07ee4062a4a6 WatchSource:0}: Error finding container 9d900482029c84fd9ecafec4792a9728b4f51c704767731e783a07ee4062a4a6: Status 404 returned error can't find the container with id 9d900482029c84fd9ecafec4792a9728b4f51c704767731e783a07ee4062a4a6 Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.108823 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.118807 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqmv" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.121468 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.125896 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-776c5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.138054 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.152112 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.156459 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b616bb02-0180-4af3-922a-6a09d2da3d67-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pftg4\" (UID: \"b616bb02-0180-4af3-922a-6a09d2da3d67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pftg4" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.163794 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-lpkrv" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.176637 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-c8m6n\" (UID: \"4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.197842 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kzfq\" (UniqueName: \"kubernetes.io/projected/93b35280-bd38-4a0c-8192-807ce4f2eb0b-kube-api-access-2kzfq\") pod \"package-server-manager-789f6589d5-zc6w2\" (UID: \"93b35280-bd38-4a0c-8192-807ce4f2eb0b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.212090 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-st9hs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.218197 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrw2v\" (UniqueName: \"kubernetes.io/projected/b315e659-0690-463c-88c4-659124922ddc-kube-api-access-mrw2v\") pod \"service-ca-9c57cc56f-dfkjs\" (UID: \"b315e659-0690-463c-88c4-659124922ddc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dfkjs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.232561 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.237197 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt6d6\" (UniqueName: \"kubernetes.io/projected/c30cdb70-a97a-4750-8a55-e4aaa0e44f9b-kube-api-access-wt6d6\") pod \"machine-config-operator-74547568cd-sw2ht\" (UID: \"c30cdb70-a97a-4750-8a55-e4aaa0e44f9b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.257470 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jbrf\" (UniqueName: \"kubernetes.io/projected/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-kube-api-access-5jbrf\") pod \"collect-profiles-29557965-jz6kr\" (UID: \"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.262124 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pftg4" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.279439 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ht66\" (UniqueName: \"kubernetes.io/projected/1666768a-cdbb-4ab1-83d8-b1ad0444f167-kube-api-access-7ht66\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7wqv\" (UID: \"1666768a-cdbb-4ab1-83d8-b1ad0444f167\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7wqv" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.283997 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8gwm6"] Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.292297 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7wqv" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.308775 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7slv\" (UniqueName: \"kubernetes.io/projected/4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f-kube-api-access-x7slv\") pod \"cluster-image-registry-operator-dc59b4c8b-c8m6n\" (UID: \"4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.317870 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/236467b8-ffbf-4e32-ba1c-b188938f8ff5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s5hxg\" (UID: \"236467b8-ffbf-4e32-ba1c-b188938f8ff5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5hxg" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.346810 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.347297 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xxwh\" (UniqueName: \"kubernetes.io/projected/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-kube-api-access-9xxwh\") pod \"marketplace-operator-79b997595-2hj4h\" (UID: \"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e\") " pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.358869 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk" event={"ID":"13a17846-77ac-4dba-a573-3b7d5c67da8d","Type":"ContainerStarted","Data":"9d900482029c84fd9ecafec4792a9728b4f51c704767731e783a07ee4062a4a6"} Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.360824 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.361827 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4c865"] Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.371462 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6" event={"ID":"095068b1-bf13-43a2-a250-a0eaeb60c6ae","Type":"ContainerStarted","Data":"0d076fcb3fc86c548a79de34a81451e77bb8e3610aceb027f282e68626267ca5"} Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.381029 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.381263 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-dfkjs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.389391 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.400581 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.406315 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.419647 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.440435 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.459236 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494058 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b827bdf7-bbd1-4b57-8485-09a312e20b43-serving-cert\") pod \"service-ca-operator-777779d784-zd8f5\" (UID: \"b827bdf7-bbd1-4b57-8485-09a312e20b43\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zd8f5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494088 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3520644-d9b3-41e1-8293-7bc8529ffbca-config\") pod \"etcd-operator-b45778765-j9qjz\" (UID: 
\"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494108 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/16733c4e-be2e-4b5a-885c-6d2fab583caf-default-certificate\") pod \"router-default-5444994796-hsfss\" (UID: \"16733c4e-be2e-4b5a-885c-6d2fab583caf\") " pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494127 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7fa9107-c6f8-4ecd-bb56-62b7be869ec9-serving-cert\") pod \"console-operator-58897d9998-d9kk5\" (UID: \"e7fa9107-c6f8-4ecd-bb56-62b7be869ec9\") " pod="openshift-console-operator/console-operator-58897d9998-d9kk5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494171 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-bound-sa-token\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494191 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/16733c4e-be2e-4b5a-885c-6d2fab583caf-stats-auth\") pod \"router-default-5444994796-hsfss\" (UID: \"16733c4e-be2e-4b5a-885c-6d2fab583caf\") " pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494206 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/e5ed96ac-e78f-424f-9aa0-f85a6f0321db-tmpfs\") pod \"packageserver-d55dfcdfc-b4kwx\" (UID: \"e5ed96ac-e78f-424f-9aa0-f85a6f0321db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494222 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a78cde11-057e-4d0d-986e-bcda5f4964d6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nvgzt\" (UID: \"a78cde11-057e-4d0d-986e-bcda5f4964d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494265 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kspvx\" (UniqueName: \"kubernetes.io/projected/c0e5952a-041e-4372-b077-b13baabaabd0-kube-api-access-kspvx\") pod \"migrator-59844c95c7-tb6lx\" (UID: \"c0e5952a-041e-4372-b077-b13baabaabd0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tb6lx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494294 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7ace8f29-86ea-40a3-8578-276187e6f1a2-srv-cert\") pod \"catalog-operator-68c6474976-vj2ld\" (UID: \"7ace8f29-86ea-40a3-8578-276187e6f1a2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494313 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b3520644-d9b3-41e1-8293-7bc8529ffbca-etcd-ca\") pod \"etcd-operator-b45778765-j9qjz\" (UID: \"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 
08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494333 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf9fc759-1702-415b-8879-0f5138ad1b46-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ksv2k\" (UID: \"cf9fc759-1702-415b-8879-0f5138ad1b46\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ksv2k" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494353 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16733c4e-be2e-4b5a-885c-6d2fab583caf-service-ca-bundle\") pod \"router-default-5444994796-hsfss\" (UID: \"16733c4e-be2e-4b5a-885c-6d2fab583caf\") " pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494384 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f915e7e-5c5a-4b5c-8b46-459a5bddb130-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8hw77\" (UID: \"7f915e7e-5c5a-4b5c-8b46-459a5bddb130\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8hw77" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494401 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkvbb\" (UniqueName: \"kubernetes.io/projected/61c89047-11ed-4a9e-b861-25e144d42b5e-kube-api-access-nkvbb\") pod \"ingress-operator-5b745b69d9-jspq2\" (UID: \"61c89047-11ed-4a9e-b861-25e144d42b5e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494415 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/cf9fc759-1702-415b-8879-0f5138ad1b46-config\") pod \"kube-controller-manager-operator-78b949d7b-ksv2k\" (UID: \"cf9fc759-1702-415b-8879-0f5138ad1b46\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ksv2k" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494432 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16733c4e-be2e-4b5a-885c-6d2fab583caf-metrics-certs\") pod \"router-default-5444994796-hsfss\" (UID: \"16733c4e-be2e-4b5a-885c-6d2fab583caf\") " pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494464 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61c89047-11ed-4a9e-b861-25e144d42b5e-trusted-ca\") pod \"ingress-operator-5b745b69d9-jspq2\" (UID: \"61c89047-11ed-4a9e-b861-25e144d42b5e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494498 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m64tk\" (UniqueName: \"kubernetes.io/projected/a78cde11-057e-4d0d-986e-bcda5f4964d6-kube-api-access-m64tk\") pod \"machine-config-controller-84d6567774-nvgzt\" (UID: \"a78cde11-057e-4d0d-986e-bcda5f4964d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494515 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5wzt\" (UniqueName: \"kubernetes.io/projected/7f915e7e-5c5a-4b5c-8b46-459a5bddb130-kube-api-access-k5wzt\") pod \"kube-storage-version-migrator-operator-b67b599dd-8hw77\" (UID: 
\"7f915e7e-5c5a-4b5c-8b46-459a5bddb130\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8hw77" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494550 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b74baec9-353b-4ada-a777-a0cedf80aaf8-trusted-ca\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494564 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61c89047-11ed-4a9e-b861-25e144d42b5e-metrics-tls\") pod \"ingress-operator-5b745b69d9-jspq2\" (UID: \"61c89047-11ed-4a9e-b861-25e144d42b5e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494578 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/581a4228-5ca8-400c-ba62-c16f4dddaddf-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jkv2j\" (UID: \"581a4228-5ca8-400c-ba62-c16f4dddaddf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494593 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0d773ac-a40c-471b-a132-9a49a2e315c9-serving-cert\") pod \"controller-manager-879f6c89f-5pqds\" (UID: \"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494611 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b74baec9-353b-4ada-a777-a0cedf80aaf8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494625 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqhf2\" (UniqueName: \"kubernetes.io/projected/e7fa9107-c6f8-4ecd-bb56-62b7be869ec9-kube-api-access-wqhf2\") pod \"console-operator-58897d9998-d9kk5\" (UID: \"e7fa9107-c6f8-4ecd-bb56-62b7be869ec9\") " pod="openshift-console-operator/console-operator-58897d9998-d9kk5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494639 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29bbt\" (UniqueName: \"kubernetes.io/projected/d0d773ac-a40c-471b-a132-9a49a2e315c9-kube-api-access-29bbt\") pod \"controller-manager-879f6c89f-5pqds\" (UID: \"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494665 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5pqds\" (UID: \"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494699 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbpxh\" (UniqueName: \"kubernetes.io/projected/1e01187b-7ff4-40bd-b726-33eedbc33f3c-kube-api-access-bbpxh\") pod 
\"multus-admission-controller-857f4d67dd-mqkjv\" (UID: \"1e01187b-7ff4-40bd-b726-33eedbc33f3c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mqkjv" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494716 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7fa9107-c6f8-4ecd-bb56-62b7be869ec9-trusted-ca\") pod \"console-operator-58897d9998-d9kk5\" (UID: \"e7fa9107-c6f8-4ecd-bb56-62b7be869ec9\") " pod="openshift-console-operator/console-operator-58897d9998-d9kk5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494732 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf9fc759-1702-415b-8879-0f5138ad1b46-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ksv2k\" (UID: \"cf9fc759-1702-415b-8879-0f5138ad1b46\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ksv2k" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494748 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvktv\" (UniqueName: \"kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-kube-api-access-rvktv\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494763 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b827bdf7-bbd1-4b57-8485-09a312e20b43-config\") pod \"service-ca-operator-777779d784-zd8f5\" (UID: \"b827bdf7-bbd1-4b57-8485-09a312e20b43\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zd8f5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 
08:59:28.494778 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b74baec9-353b-4ada-a777-a0cedf80aaf8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494793 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js9gc\" (UniqueName: \"kubernetes.io/projected/7ace8f29-86ea-40a3-8578-276187e6f1a2-kube-api-access-js9gc\") pod \"catalog-operator-68c6474976-vj2ld\" (UID: \"7ace8f29-86ea-40a3-8578-276187e6f1a2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494817 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-client-ca\") pod \"controller-manager-879f6c89f-5pqds\" (UID: \"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494860 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-registry-tls\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494874 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b3520644-d9b3-41e1-8293-7bc8529ffbca-etcd-service-ca\") pod 
\"etcd-operator-b45778765-j9qjz\" (UID: \"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494889 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3520644-d9b3-41e1-8293-7bc8529ffbca-serving-cert\") pod \"etcd-operator-b45778765-j9qjz\" (UID: \"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494918 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a78cde11-057e-4d0d-986e-bcda5f4964d6-proxy-tls\") pod \"machine-config-controller-84d6567774-nvgzt\" (UID: \"a78cde11-057e-4d0d-986e-bcda5f4964d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494942 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4dnw\" (UniqueName: \"kubernetes.io/projected/581a4228-5ca8-400c-ba62-c16f4dddaddf-kube-api-access-w4dnw\") pod \"olm-operator-6b444d44fb-jkv2j\" (UID: \"581a4228-5ca8-400c-ba62-c16f4dddaddf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494956 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7ace8f29-86ea-40a3-8578-276187e6f1a2-profile-collector-cert\") pod \"catalog-operator-68c6474976-vj2ld\" (UID: \"7ace8f29-86ea-40a3-8578-276187e6f1a2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.494984 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkwx5\" (UniqueName: \"kubernetes.io/projected/b827bdf7-bbd1-4b57-8485-09a312e20b43-kube-api-access-vkwx5\") pod \"service-ca-operator-777779d784-zd8f5\" (UID: \"b827bdf7-bbd1-4b57-8485-09a312e20b43\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zd8f5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.495000 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7fa9107-c6f8-4ecd-bb56-62b7be869ec9-config\") pod \"console-operator-58897d9998-d9kk5\" (UID: \"e7fa9107-c6f8-4ecd-bb56-62b7be869ec9\") " pod="openshift-console-operator/console-operator-58897d9998-d9kk5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.495014 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b74baec9-353b-4ada-a777-a0cedf80aaf8-registry-certificates\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.495029 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5ed96ac-e78f-424f-9aa0-f85a6f0321db-apiservice-cert\") pod \"packageserver-d55dfcdfc-b4kwx\" (UID: \"e5ed96ac-e78f-424f-9aa0-f85a6f0321db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.495043 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5ed96ac-e78f-424f-9aa0-f85a6f0321db-webhook-cert\") pod \"packageserver-d55dfcdfc-b4kwx\" 
(UID: \"e5ed96ac-e78f-424f-9aa0-f85a6f0321db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.495069 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr7s2\" (UniqueName: \"kubernetes.io/projected/e5ed96ac-e78f-424f-9aa0-f85a6f0321db-kube-api-access-zr7s2\") pod \"packageserver-d55dfcdfc-b4kwx\" (UID: \"e5ed96ac-e78f-424f-9aa0-f85a6f0321db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.495088 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf425\" (UniqueName: \"kubernetes.io/projected/b3520644-d9b3-41e1-8293-7bc8529ffbca-kube-api-access-zf425\") pod \"etcd-operator-b45778765-j9qjz\" (UID: \"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.495109 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mvrj\" (UniqueName: \"kubernetes.io/projected/16733c4e-be2e-4b5a-885c-6d2fab583caf-kube-api-access-5mvrj\") pod \"router-default-5444994796-hsfss\" (UID: \"16733c4e-be2e-4b5a-885c-6d2fab583caf\") " pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.495130 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b3520644-d9b3-41e1-8293-7bc8529ffbca-etcd-client\") pod \"etcd-operator-b45778765-j9qjz\" (UID: \"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.495166 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.495186 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1e01187b-7ff4-40bd-b726-33eedbc33f3c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mqkjv\" (UID: \"1e01187b-7ff4-40bd-b726-33eedbc33f3c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mqkjv" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.495210 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61c89047-11ed-4a9e-b861-25e144d42b5e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jspq2\" (UID: \"61c89047-11ed-4a9e-b861-25e144d42b5e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.495227 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f915e7e-5c5a-4b5c-8b46-459a5bddb130-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8hw77\" (UID: \"7f915e7e-5c5a-4b5c-8b46-459a5bddb130\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8hw77" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.495250 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-config\") pod \"controller-manager-879f6c89f-5pqds\" (UID: 
\"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.495312 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/581a4228-5ca8-400c-ba62-c16f4dddaddf-srv-cert\") pod \"olm-operator-6b444d44fb-jkv2j\" (UID: \"581a4228-5ca8-400c-ba62-c16f4dddaddf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j" Mar 14 08:59:28 crc kubenswrapper[4956]: E0314 08:59:28.500144 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:28.998045302 +0000 UTC m=+174.510737570 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.543759 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.596393 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:28 crc kubenswrapper[4956]: E0314 08:59:28.596903 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:29.096876114 +0000 UTC m=+174.609568382 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.596980 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b74baec9-353b-4ada-a777-a0cedf80aaf8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597042 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js9gc\" (UniqueName: 
\"kubernetes.io/projected/7ace8f29-86ea-40a3-8578-276187e6f1a2-kube-api-access-js9gc\") pod \"catalog-operator-68c6474976-vj2ld\" (UID: \"7ace8f29-86ea-40a3-8578-276187e6f1a2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597119 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-client-ca\") pod \"controller-manager-879f6c89f-5pqds\" (UID: \"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597197 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dczk\" (UniqueName: \"kubernetes.io/projected/93ee50e1-8448-4b8e-89be-bba3e87c0843-kube-api-access-2dczk\") pod \"dns-default-m5sxz\" (UID: \"93ee50e1-8448-4b8e-89be-bba3e87c0843\") " pod="openshift-dns/dns-default-m5sxz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597218 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-registry-tls\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597260 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b3520644-d9b3-41e1-8293-7bc8529ffbca-etcd-service-ca\") pod \"etcd-operator-b45778765-j9qjz\" (UID: \"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597279 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3520644-d9b3-41e1-8293-7bc8529ffbca-serving-cert\") pod \"etcd-operator-b45778765-j9qjz\" (UID: \"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597295 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a78cde11-057e-4d0d-986e-bcda5f4964d6-proxy-tls\") pod \"machine-config-controller-84d6567774-nvgzt\" (UID: \"a78cde11-057e-4d0d-986e-bcda5f4964d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597348 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4dnw\" (UniqueName: \"kubernetes.io/projected/581a4228-5ca8-400c-ba62-c16f4dddaddf-kube-api-access-w4dnw\") pod \"olm-operator-6b444d44fb-jkv2j\" (UID: \"581a4228-5ca8-400c-ba62-c16f4dddaddf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597367 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7ace8f29-86ea-40a3-8578-276187e6f1a2-profile-collector-cert\") pod \"catalog-operator-68c6474976-vj2ld\" (UID: \"7ace8f29-86ea-40a3-8578-276187e6f1a2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597385 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkwx5\" (UniqueName: \"kubernetes.io/projected/b827bdf7-bbd1-4b57-8485-09a312e20b43-kube-api-access-vkwx5\") pod \"service-ca-operator-777779d784-zd8f5\" (UID: \"b827bdf7-bbd1-4b57-8485-09a312e20b43\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-zd8f5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597667 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/60855808-2138-47e7-9539-df7c86ebe635-plugins-dir\") pod \"csi-hostpathplugin-jqf62\" (UID: \"60855808-2138-47e7-9539-df7c86ebe635\") " pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597701 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7fa9107-c6f8-4ecd-bb56-62b7be869ec9-config\") pod \"console-operator-58897d9998-d9kk5\" (UID: \"e7fa9107-c6f8-4ecd-bb56-62b7be869ec9\") " pod="openshift-console-operator/console-operator-58897d9998-d9kk5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597727 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59da700a-d9ae-49ac-a504-b7d9ea7ff0b6-cert\") pod \"ingress-canary-xm4x6\" (UID: \"59da700a-d9ae-49ac-a504-b7d9ea7ff0b6\") " pod="openshift-ingress-canary/ingress-canary-xm4x6" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597769 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/60855808-2138-47e7-9539-df7c86ebe635-csi-data-dir\") pod \"csi-hostpathplugin-jqf62\" (UID: \"60855808-2138-47e7-9539-df7c86ebe635\") " pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597794 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b74baec9-353b-4ada-a777-a0cedf80aaf8-registry-certificates\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: 
\"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597817 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5ed96ac-e78f-424f-9aa0-f85a6f0321db-apiservice-cert\") pod \"packageserver-d55dfcdfc-b4kwx\" (UID: \"e5ed96ac-e78f-424f-9aa0-f85a6f0321db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597853 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5ed96ac-e78f-424f-9aa0-f85a6f0321db-webhook-cert\") pod \"packageserver-d55dfcdfc-b4kwx\" (UID: \"e5ed96ac-e78f-424f-9aa0-f85a6f0321db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597876 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr7s2\" (UniqueName: \"kubernetes.io/projected/e5ed96ac-e78f-424f-9aa0-f85a6f0321db-kube-api-access-zr7s2\") pod \"packageserver-d55dfcdfc-b4kwx\" (UID: \"e5ed96ac-e78f-424f-9aa0-f85a6f0321db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597910 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf425\" (UniqueName: \"kubernetes.io/projected/b3520644-d9b3-41e1-8293-7bc8529ffbca-kube-api-access-zf425\") pod \"etcd-operator-b45778765-j9qjz\" (UID: \"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.597968 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mvrj\" (UniqueName: 
\"kubernetes.io/projected/16733c4e-be2e-4b5a-885c-6d2fab583caf-kube-api-access-5mvrj\") pod \"router-default-5444994796-hsfss\" (UID: \"16733c4e-be2e-4b5a-885c-6d2fab583caf\") " pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.598120 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b3520644-d9b3-41e1-8293-7bc8529ffbca-etcd-client\") pod \"etcd-operator-b45778765-j9qjz\" (UID: \"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.598144 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.598814 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1e01187b-7ff4-40bd-b726-33eedbc33f3c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mqkjv\" (UID: \"1e01187b-7ff4-40bd-b726-33eedbc33f3c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mqkjv" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.598860 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61c89047-11ed-4a9e-b861-25e144d42b5e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jspq2\" (UID: \"61c89047-11ed-4a9e-b861-25e144d42b5e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.598911 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f915e7e-5c5a-4b5c-8b46-459a5bddb130-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8hw77\" (UID: \"7f915e7e-5c5a-4b5c-8b46-459a5bddb130\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8hw77" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.598969 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-config\") pod \"controller-manager-879f6c89f-5pqds\" (UID: \"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.598993 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8f87de9e-e590-4bb0-8f62-a5f4bf1922cb-certs\") pod \"machine-config-server-d7cv4\" (UID: \"8f87de9e-e590-4bb0-8f62-a5f4bf1922cb\") " pod="openshift-machine-config-operator/machine-config-server-d7cv4" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.599042 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6wj4\" (UniqueName: \"kubernetes.io/projected/60855808-2138-47e7-9539-df7c86ebe635-kube-api-access-m6wj4\") pod \"csi-hostpathplugin-jqf62\" (UID: \"60855808-2138-47e7-9539-df7c86ebe635\") " pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.599085 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/581a4228-5ca8-400c-ba62-c16f4dddaddf-srv-cert\") pod \"olm-operator-6b444d44fb-jkv2j\" (UID: \"581a4228-5ca8-400c-ba62-c16f4dddaddf\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.599122 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/60855808-2138-47e7-9539-df7c86ebe635-registration-dir\") pod \"csi-hostpathplugin-jqf62\" (UID: \"60855808-2138-47e7-9539-df7c86ebe635\") " pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.599174 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b827bdf7-bbd1-4b57-8485-09a312e20b43-serving-cert\") pod \"service-ca-operator-777779d784-zd8f5\" (UID: \"b827bdf7-bbd1-4b57-8485-09a312e20b43\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zd8f5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.599194 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3520644-d9b3-41e1-8293-7bc8529ffbca-config\") pod \"etcd-operator-b45778765-j9qjz\" (UID: \"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.599365 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8f87de9e-e590-4bb0-8f62-a5f4bf1922cb-node-bootstrap-token\") pod \"machine-config-server-d7cv4\" (UID: \"8f87de9e-e590-4bb0-8f62-a5f4bf1922cb\") " pod="openshift-machine-config-operator/machine-config-server-d7cv4" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.599402 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/16733c4e-be2e-4b5a-885c-6d2fab583caf-default-certificate\") 
pod \"router-default-5444994796-hsfss\" (UID: \"16733c4e-be2e-4b5a-885c-6d2fab583caf\") " pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.599418 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7fa9107-c6f8-4ecd-bb56-62b7be869ec9-serving-cert\") pod \"console-operator-58897d9998-d9kk5\" (UID: \"e7fa9107-c6f8-4ecd-bb56-62b7be869ec9\") " pod="openshift-console-operator/console-operator-58897d9998-d9kk5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.599641 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-bound-sa-token\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.599703 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/16733c4e-be2e-4b5a-885c-6d2fab583caf-stats-auth\") pod \"router-default-5444994796-hsfss\" (UID: \"16733c4e-be2e-4b5a-885c-6d2fab583caf\") " pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.599740 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e5ed96ac-e78f-424f-9aa0-f85a6f0321db-tmpfs\") pod \"packageserver-d55dfcdfc-b4kwx\" (UID: \"e5ed96ac-e78f-424f-9aa0-f85a6f0321db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.599759 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/a78cde11-057e-4d0d-986e-bcda5f4964d6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nvgzt\" (UID: \"a78cde11-057e-4d0d-986e-bcda5f4964d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600000 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kspvx\" (UniqueName: \"kubernetes.io/projected/c0e5952a-041e-4372-b077-b13baabaabd0-kube-api-access-kspvx\") pod \"migrator-59844c95c7-tb6lx\" (UID: \"c0e5952a-041e-4372-b077-b13baabaabd0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tb6lx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600025 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7ace8f29-86ea-40a3-8578-276187e6f1a2-srv-cert\") pod \"catalog-operator-68c6474976-vj2ld\" (UID: \"7ace8f29-86ea-40a3-8578-276187e6f1a2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600050 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b3520644-d9b3-41e1-8293-7bc8529ffbca-etcd-ca\") pod \"etcd-operator-b45778765-j9qjz\" (UID: \"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600091 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf9fc759-1702-415b-8879-0f5138ad1b46-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ksv2k\" (UID: \"cf9fc759-1702-415b-8879-0f5138ad1b46\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ksv2k" Mar 14 08:59:28 crc 
kubenswrapper[4956]: I0314 08:59:28.600116 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16733c4e-be2e-4b5a-885c-6d2fab583caf-service-ca-bundle\") pod \"router-default-5444994796-hsfss\" (UID: \"16733c4e-be2e-4b5a-885c-6d2fab583caf\") " pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600142 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93ee50e1-8448-4b8e-89be-bba3e87c0843-config-volume\") pod \"dns-default-m5sxz\" (UID: \"93ee50e1-8448-4b8e-89be-bba3e87c0843\") " pod="openshift-dns/dns-default-m5sxz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600188 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f915e7e-5c5a-4b5c-8b46-459a5bddb130-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8hw77\" (UID: \"7f915e7e-5c5a-4b5c-8b46-459a5bddb130\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8hw77" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600207 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkvbb\" (UniqueName: \"kubernetes.io/projected/61c89047-11ed-4a9e-b861-25e144d42b5e-kube-api-access-nkvbb\") pod \"ingress-operator-5b745b69d9-jspq2\" (UID: \"61c89047-11ed-4a9e-b861-25e144d42b5e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600226 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf9fc759-1702-415b-8879-0f5138ad1b46-config\") pod \"kube-controller-manager-operator-78b949d7b-ksv2k\" (UID: 
\"cf9fc759-1702-415b-8879-0f5138ad1b46\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ksv2k" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600244 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/93ee50e1-8448-4b8e-89be-bba3e87c0843-metrics-tls\") pod \"dns-default-m5sxz\" (UID: \"93ee50e1-8448-4b8e-89be-bba3e87c0843\") " pod="openshift-dns/dns-default-m5sxz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600288 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16733c4e-be2e-4b5a-885c-6d2fab583caf-metrics-certs\") pod \"router-default-5444994796-hsfss\" (UID: \"16733c4e-be2e-4b5a-885c-6d2fab583caf\") " pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600305 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbvqz\" (UniqueName: \"kubernetes.io/projected/8f87de9e-e590-4bb0-8f62-a5f4bf1922cb-kube-api-access-vbvqz\") pod \"machine-config-server-d7cv4\" (UID: \"8f87de9e-e590-4bb0-8f62-a5f4bf1922cb\") " pod="openshift-machine-config-operator/machine-config-server-d7cv4" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600344 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61c89047-11ed-4a9e-b861-25e144d42b5e-trusted-ca\") pod \"ingress-operator-5b745b69d9-jspq2\" (UID: \"61c89047-11ed-4a9e-b861-25e144d42b5e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600362 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m64tk\" (UniqueName: 
\"kubernetes.io/projected/a78cde11-057e-4d0d-986e-bcda5f4964d6-kube-api-access-m64tk\") pod \"machine-config-controller-84d6567774-nvgzt\" (UID: \"a78cde11-057e-4d0d-986e-bcda5f4964d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600379 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5wzt\" (UniqueName: \"kubernetes.io/projected/7f915e7e-5c5a-4b5c-8b46-459a5bddb130-kube-api-access-k5wzt\") pod \"kube-storage-version-migrator-operator-b67b599dd-8hw77\" (UID: \"7f915e7e-5c5a-4b5c-8b46-459a5bddb130\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8hw77" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600412 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b74baec9-353b-4ada-a777-a0cedf80aaf8-trusted-ca\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600430 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98zn7\" (UniqueName: \"kubernetes.io/projected/59da700a-d9ae-49ac-a504-b7d9ea7ff0b6-kube-api-access-98zn7\") pod \"ingress-canary-xm4x6\" (UID: \"59da700a-d9ae-49ac-a504-b7d9ea7ff0b6\") " pod="openshift-ingress-canary/ingress-canary-xm4x6" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600446 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61c89047-11ed-4a9e-b861-25e144d42b5e-metrics-tls\") pod \"ingress-operator-5b745b69d9-jspq2\" (UID: \"61c89047-11ed-4a9e-b861-25e144d42b5e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2" Mar 
14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600463 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/581a4228-5ca8-400c-ba62-c16f4dddaddf-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jkv2j\" (UID: \"581a4228-5ca8-400c-ba62-c16f4dddaddf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600495 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0d773ac-a40c-471b-a132-9a49a2e315c9-serving-cert\") pod \"controller-manager-879f6c89f-5pqds\" (UID: \"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600519 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b74baec9-353b-4ada-a777-a0cedf80aaf8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600551 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqhf2\" (UniqueName: \"kubernetes.io/projected/e7fa9107-c6f8-4ecd-bb56-62b7be869ec9-kube-api-access-wqhf2\") pod \"console-operator-58897d9998-d9kk5\" (UID: \"e7fa9107-c6f8-4ecd-bb56-62b7be869ec9\") " pod="openshift-console-operator/console-operator-58897d9998-d9kk5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600551 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-client-ca\") pod \"controller-manager-879f6c89f-5pqds\" (UID: 
\"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600568 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29bbt\" (UniqueName: \"kubernetes.io/projected/d0d773ac-a40c-471b-a132-9a49a2e315c9-kube-api-access-29bbt\") pod \"controller-manager-879f6c89f-5pqds\" (UID: \"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600599 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5pqds\" (UID: \"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600659 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbpxh\" (UniqueName: \"kubernetes.io/projected/1e01187b-7ff4-40bd-b726-33eedbc33f3c-kube-api-access-bbpxh\") pod \"multus-admission-controller-857f4d67dd-mqkjv\" (UID: \"1e01187b-7ff4-40bd-b726-33eedbc33f3c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mqkjv" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600684 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7fa9107-c6f8-4ecd-bb56-62b7be869ec9-trusted-ca\") pod \"console-operator-58897d9998-d9kk5\" (UID: \"e7fa9107-c6f8-4ecd-bb56-62b7be869ec9\") " pod="openshift-console-operator/console-operator-58897d9998-d9kk5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600701 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cf9fc759-1702-415b-8879-0f5138ad1b46-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ksv2k\" (UID: \"cf9fc759-1702-415b-8879-0f5138ad1b46\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ksv2k" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600716 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/60855808-2138-47e7-9539-df7c86ebe635-socket-dir\") pod \"csi-hostpathplugin-jqf62\" (UID: \"60855808-2138-47e7-9539-df7c86ebe635\") " pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600731 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/60855808-2138-47e7-9539-df7c86ebe635-mountpoint-dir\") pod \"csi-hostpathplugin-jqf62\" (UID: \"60855808-2138-47e7-9539-df7c86ebe635\") " pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600782 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvktv\" (UniqueName: \"kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-kube-api-access-rvktv\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.600805 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b827bdf7-bbd1-4b57-8485-09a312e20b43-config\") pod \"service-ca-operator-777779d784-zd8f5\" (UID: \"b827bdf7-bbd1-4b57-8485-09a312e20b43\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zd8f5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.601720 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7fa9107-c6f8-4ecd-bb56-62b7be869ec9-config\") pod \"console-operator-58897d9998-d9kk5\" (UID: \"e7fa9107-c6f8-4ecd-bb56-62b7be869ec9\") " pod="openshift-console-operator/console-operator-58897d9998-d9kk5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.610573 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16733c4e-be2e-4b5a-885c-6d2fab583caf-service-ca-bundle\") pod \"router-default-5444994796-hsfss\" (UID: \"16733c4e-be2e-4b5a-885c-6d2fab583caf\") " pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.612314 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-registry-tls\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.612574 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f915e7e-5c5a-4b5c-8b46-459a5bddb130-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8hw77\" (UID: \"7f915e7e-5c5a-4b5c-8b46-459a5bddb130\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8hw77" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.612925 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/581a4228-5ca8-400c-ba62-c16f4dddaddf-srv-cert\") pod \"olm-operator-6b444d44fb-jkv2j\" (UID: \"581a4228-5ca8-400c-ba62-c16f4dddaddf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j" Mar 14 08:59:28 
crc kubenswrapper[4956]: I0314 08:59:28.613463 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5ed96ac-e78f-424f-9aa0-f85a6f0321db-apiservice-cert\") pod \"packageserver-d55dfcdfc-b4kwx\" (UID: \"e5ed96ac-e78f-424f-9aa0-f85a6f0321db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.614443 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b3520644-d9b3-41e1-8293-7bc8529ffbca-etcd-service-ca\") pod \"etcd-operator-b45778765-j9qjz\" (UID: \"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.614516 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf9fc759-1702-415b-8879-0f5138ad1b46-config\") pod \"kube-controller-manager-operator-78b949d7b-ksv2k\" (UID: \"cf9fc759-1702-415b-8879-0f5138ad1b46\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ksv2k" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.616162 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b74baec9-353b-4ada-a777-a0cedf80aaf8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.616640 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1e01187b-7ff4-40bd-b726-33eedbc33f3c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mqkjv\" (UID: \"1e01187b-7ff4-40bd-b726-33eedbc33f3c\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-mqkjv" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.620267 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0d773ac-a40c-471b-a132-9a49a2e315c9-serving-cert\") pod \"controller-manager-879f6c89f-5pqds\" (UID: \"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.631123 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/581a4228-5ca8-400c-ba62-c16f4dddaddf-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jkv2j\" (UID: \"581a4228-5ca8-400c-ba62-c16f4dddaddf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.631317 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3520644-d9b3-41e1-8293-7bc8529ffbca-serving-cert\") pod \"etcd-operator-b45778765-j9qjz\" (UID: \"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.632021 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5ed96ac-e78f-424f-9aa0-f85a6f0321db-webhook-cert\") pod \"packageserver-d55dfcdfc-b4kwx\" (UID: \"e5ed96ac-e78f-424f-9aa0-f85a6f0321db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.632076 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b74baec9-353b-4ada-a777-a0cedf80aaf8-registry-certificates\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: 
\"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.632458 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b74baec9-353b-4ada-a777-a0cedf80aaf8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.632746 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f915e7e-5c5a-4b5c-8b46-459a5bddb130-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8hw77\" (UID: \"7f915e7e-5c5a-4b5c-8b46-459a5bddb130\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8hw77" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.632904 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/16733c4e-be2e-4b5a-885c-6d2fab583caf-stats-auth\") pod \"router-default-5444994796-hsfss\" (UID: \"16733c4e-be2e-4b5a-885c-6d2fab583caf\") " pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.635326 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5pqds\" (UID: \"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.635437 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/16733c4e-be2e-4b5a-885c-6d2fab583caf-default-certificate\") pod \"router-default-5444994796-hsfss\" (UID: \"16733c4e-be2e-4b5a-885c-6d2fab583caf\") " pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.637647 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e5ed96ac-e78f-424f-9aa0-f85a6f0321db-tmpfs\") pod \"packageserver-d55dfcdfc-b4kwx\" (UID: \"e5ed96ac-e78f-424f-9aa0-f85a6f0321db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.638704 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7fa9107-c6f8-4ecd-bb56-62b7be869ec9-trusted-ca\") pod \"console-operator-58897d9998-d9kk5\" (UID: \"e7fa9107-c6f8-4ecd-bb56-62b7be869ec9\") " pod="openshift-console-operator/console-operator-58897d9998-d9kk5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.640974 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.647941 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt"] Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.648245 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b3520644-d9b3-41e1-8293-7bc8529ffbca-etcd-ca\") pod \"etcd-operator-b45778765-j9qjz\" (UID: \"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.648945 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a78cde11-057e-4d0d-986e-bcda5f4964d6-proxy-tls\") pod \"machine-config-controller-84d6567774-nvgzt\" (UID: \"a78cde11-057e-4d0d-986e-bcda5f4964d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.649443 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf9fc759-1702-415b-8879-0f5138ad1b46-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ksv2k\" (UID: \"cf9fc759-1702-415b-8879-0f5138ad1b46\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ksv2k" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.652219 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3520644-d9b3-41e1-8293-7bc8529ffbca-config\") pod \"etcd-operator-b45778765-j9qjz\" (UID: \"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.654800 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b827bdf7-bbd1-4b57-8485-09a312e20b43-config\") pod \"service-ca-operator-777779d784-zd8f5\" (UID: \"b827bdf7-bbd1-4b57-8485-09a312e20b43\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zd8f5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.661801 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5hxg" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.661896 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7fa9107-c6f8-4ecd-bb56-62b7be869ec9-serving-cert\") pod \"console-operator-58897d9998-d9kk5\" (UID: \"e7fa9107-c6f8-4ecd-bb56-62b7be869ec9\") " pod="openshift-console-operator/console-operator-58897d9998-d9kk5" Mar 14 08:59:28 crc kubenswrapper[4956]: E0314 08:59:28.662849 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:29.162831276 +0000 UTC m=+174.675523544 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.663312 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a78cde11-057e-4d0d-986e-bcda5f4964d6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nvgzt\" (UID: \"a78cde11-057e-4d0d-986e-bcda5f4964d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.664498 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7ace8f29-86ea-40a3-8578-276187e6f1a2-srv-cert\") pod \"catalog-operator-68c6474976-vj2ld\" (UID: \"7ace8f29-86ea-40a3-8578-276187e6f1a2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.665601 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61c89047-11ed-4a9e-b861-25e144d42b5e-trusted-ca\") pod \"ingress-operator-5b745b69d9-jspq2\" (UID: \"61c89047-11ed-4a9e-b861-25e144d42b5e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.665926 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js9gc\" (UniqueName: \"kubernetes.io/projected/7ace8f29-86ea-40a3-8578-276187e6f1a2-kube-api-access-js9gc\") pod 
\"catalog-operator-68c6474976-vj2ld\" (UID: \"7ace8f29-86ea-40a3-8578-276187e6f1a2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.666267 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-config\") pod \"controller-manager-879f6c89f-5pqds\" (UID: \"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.667629 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b74baec9-353b-4ada-a777-a0cedf80aaf8-trusted-ca\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.670234 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b3520644-d9b3-41e1-8293-7bc8529ffbca-etcd-client\") pod \"etcd-operator-b45778765-j9qjz\" (UID: \"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.673044 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b827bdf7-bbd1-4b57-8485-09a312e20b43-serving-cert\") pod \"service-ca-operator-777779d784-zd8f5\" (UID: \"b827bdf7-bbd1-4b57-8485-09a312e20b43\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zd8f5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.673411 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/7ace8f29-86ea-40a3-8578-276187e6f1a2-profile-collector-cert\") pod \"catalog-operator-68c6474976-vj2ld\" (UID: \"7ace8f29-86ea-40a3-8578-276187e6f1a2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.674879 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16733c4e-be2e-4b5a-885c-6d2fab583caf-metrics-certs\") pod \"router-default-5444994796-hsfss\" (UID: \"16733c4e-be2e-4b5a-885c-6d2fab583caf\") " pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.676207 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61c89047-11ed-4a9e-b861-25e144d42b5e-metrics-tls\") pod \"ingress-operator-5b745b69d9-jspq2\" (UID: \"61c89047-11ed-4a9e-b861-25e144d42b5e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.689195 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61c89047-11ed-4a9e-b861-25e144d42b5e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jspq2\" (UID: \"61c89047-11ed-4a9e-b861-25e144d42b5e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.696923 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.697050 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr7s2\" (UniqueName: \"kubernetes.io/projected/e5ed96ac-e78f-424f-9aa0-f85a6f0321db-kube-api-access-zr7s2\") pod \"packageserver-d55dfcdfc-b4kwx\" (UID: \"e5ed96ac-e78f-424f-9aa0-f85a6f0321db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.701822 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.702004 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/60855808-2138-47e7-9539-df7c86ebe635-socket-dir\") pod \"csi-hostpathplugin-jqf62\" (UID: \"60855808-2138-47e7-9539-df7c86ebe635\") " pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.702026 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/60855808-2138-47e7-9539-df7c86ebe635-mountpoint-dir\") pod \"csi-hostpathplugin-jqf62\" (UID: \"60855808-2138-47e7-9539-df7c86ebe635\") " pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.702050 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dczk\" (UniqueName: \"kubernetes.io/projected/93ee50e1-8448-4b8e-89be-bba3e87c0843-kube-api-access-2dczk\") pod \"dns-default-m5sxz\" 
(UID: \"93ee50e1-8448-4b8e-89be-bba3e87c0843\") " pod="openshift-dns/dns-default-m5sxz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.702091 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/60855808-2138-47e7-9539-df7c86ebe635-plugins-dir\") pod \"csi-hostpathplugin-jqf62\" (UID: \"60855808-2138-47e7-9539-df7c86ebe635\") " pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.702107 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59da700a-d9ae-49ac-a504-b7d9ea7ff0b6-cert\") pod \"ingress-canary-xm4x6\" (UID: \"59da700a-d9ae-49ac-a504-b7d9ea7ff0b6\") " pod="openshift-ingress-canary/ingress-canary-xm4x6" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.702120 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/60855808-2138-47e7-9539-df7c86ebe635-csi-data-dir\") pod \"csi-hostpathplugin-jqf62\" (UID: \"60855808-2138-47e7-9539-df7c86ebe635\") " pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.702167 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8f87de9e-e590-4bb0-8f62-a5f4bf1922cb-certs\") pod \"machine-config-server-d7cv4\" (UID: \"8f87de9e-e590-4bb0-8f62-a5f4bf1922cb\") " pod="openshift-machine-config-operator/machine-config-server-d7cv4" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.702181 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6wj4\" (UniqueName: \"kubernetes.io/projected/60855808-2138-47e7-9539-df7c86ebe635-kube-api-access-m6wj4\") pod \"csi-hostpathplugin-jqf62\" (UID: \"60855808-2138-47e7-9539-df7c86ebe635\") " 
pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.702199 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/60855808-2138-47e7-9539-df7c86ebe635-registration-dir\") pod \"csi-hostpathplugin-jqf62\" (UID: \"60855808-2138-47e7-9539-df7c86ebe635\") " pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.702216 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8f87de9e-e590-4bb0-8f62-a5f4bf1922cb-node-bootstrap-token\") pod \"machine-config-server-d7cv4\" (UID: \"8f87de9e-e590-4bb0-8f62-a5f4bf1922cb\") " pod="openshift-machine-config-operator/machine-config-server-d7cv4" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.702264 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93ee50e1-8448-4b8e-89be-bba3e87c0843-config-volume\") pod \"dns-default-m5sxz\" (UID: \"93ee50e1-8448-4b8e-89be-bba3e87c0843\") " pod="openshift-dns/dns-default-m5sxz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.702293 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/93ee50e1-8448-4b8e-89be-bba3e87c0843-metrics-tls\") pod \"dns-default-m5sxz\" (UID: \"93ee50e1-8448-4b8e-89be-bba3e87c0843\") " pod="openshift-dns/dns-default-m5sxz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.702308 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbvqz\" (UniqueName: \"kubernetes.io/projected/8f87de9e-e590-4bb0-8f62-a5f4bf1922cb-kube-api-access-vbvqz\") pod \"machine-config-server-d7cv4\" (UID: \"8f87de9e-e590-4bb0-8f62-a5f4bf1922cb\") " 
pod="openshift-machine-config-operator/machine-config-server-d7cv4" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.702344 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98zn7\" (UniqueName: \"kubernetes.io/projected/59da700a-d9ae-49ac-a504-b7d9ea7ff0b6-kube-api-access-98zn7\") pod \"ingress-canary-xm4x6\" (UID: \"59da700a-d9ae-49ac-a504-b7d9ea7ff0b6\") " pod="openshift-ingress-canary/ingress-canary-xm4x6" Mar 14 08:59:28 crc kubenswrapper[4956]: E0314 08:59:28.702563 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:29.202548645 +0000 UTC m=+174.715240913 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.702759 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/60855808-2138-47e7-9539-df7c86ebe635-socket-dir\") pod \"csi-hostpathplugin-jqf62\" (UID: \"60855808-2138-47e7-9539-df7c86ebe635\") " pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.702802 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/60855808-2138-47e7-9539-df7c86ebe635-mountpoint-dir\") pod \"csi-hostpathplugin-jqf62\" (UID: 
\"60855808-2138-47e7-9539-df7c86ebe635\") " pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.702878 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/60855808-2138-47e7-9539-df7c86ebe635-registration-dir\") pod \"csi-hostpathplugin-jqf62\" (UID: \"60855808-2138-47e7-9539-df7c86ebe635\") " pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.703046 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/60855808-2138-47e7-9539-df7c86ebe635-plugins-dir\") pod \"csi-hostpathplugin-jqf62\" (UID: \"60855808-2138-47e7-9539-df7c86ebe635\") " pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.703179 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/60855808-2138-47e7-9539-df7c86ebe635-csi-data-dir\") pod \"csi-hostpathplugin-jqf62\" (UID: \"60855808-2138-47e7-9539-df7c86ebe635\") " pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.704928 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93ee50e1-8448-4b8e-89be-bba3e87c0843-config-volume\") pod \"dns-default-m5sxz\" (UID: \"93ee50e1-8448-4b8e-89be-bba3e87c0843\") " pod="openshift-dns/dns-default-m5sxz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.706712 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59da700a-d9ae-49ac-a504-b7d9ea7ff0b6-cert\") pod \"ingress-canary-xm4x6\" (UID: \"59da700a-d9ae-49ac-a504-b7d9ea7ff0b6\") " pod="openshift-ingress-canary/ingress-canary-xm4x6" Mar 14 08:59:28 crc 
kubenswrapper[4956]: I0314 08:59:28.707177 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/93ee50e1-8448-4b8e-89be-bba3e87c0843-metrics-tls\") pod \"dns-default-m5sxz\" (UID: \"93ee50e1-8448-4b8e-89be-bba3e87c0843\") " pod="openshift-dns/dns-default-m5sxz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.708254 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8f87de9e-e590-4bb0-8f62-a5f4bf1922cb-certs\") pod \"machine-config-server-d7cv4\" (UID: \"8f87de9e-e590-4bb0-8f62-a5f4bf1922cb\") " pod="openshift-machine-config-operator/machine-config-server-d7cv4" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.708341 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8f87de9e-e590-4bb0-8f62-a5f4bf1922cb-node-bootstrap-token\") pod \"machine-config-server-d7cv4\" (UID: \"8f87de9e-e590-4bb0-8f62-a5f4bf1922cb\") " pod="openshift-machine-config-operator/machine-config-server-d7cv4" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.716057 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4dnw\" (UniqueName: \"kubernetes.io/projected/581a4228-5ca8-400c-ba62-c16f4dddaddf-kube-api-access-w4dnw\") pod \"olm-operator-6b444d44fb-jkv2j\" (UID: \"581a4228-5ca8-400c-ba62-c16f4dddaddf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.716498 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.733458 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkwx5\" (UniqueName: \"kubernetes.io/projected/b827bdf7-bbd1-4b57-8485-09a312e20b43-kube-api-access-vkwx5\") pod \"service-ca-operator-777779d784-zd8f5\" (UID: \"b827bdf7-bbd1-4b57-8485-09a312e20b43\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zd8f5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.757291 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kspvx\" (UniqueName: \"kubernetes.io/projected/c0e5952a-041e-4372-b077-b13baabaabd0-kube-api-access-kspvx\") pod \"migrator-59844c95c7-tb6lx\" (UID: \"c0e5952a-041e-4372-b077-b13baabaabd0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tb6lx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.775090 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m64tk\" (UniqueName: \"kubernetes.io/projected/a78cde11-057e-4d0d-986e-bcda5f4964d6-kube-api-access-m64tk\") pod \"machine-config-controller-84d6567774-nvgzt\" (UID: \"a78cde11-057e-4d0d-986e-bcda5f4964d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.803813 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: E0314 08:59:28.804284 4956 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:29.304268559 +0000 UTC m=+174.816960827 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.823823 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqhf2\" (UniqueName: \"kubernetes.io/projected/e7fa9107-c6f8-4ecd-bb56-62b7be869ec9-kube-api-access-wqhf2\") pod \"console-operator-58897d9998-d9kk5\" (UID: \"e7fa9107-c6f8-4ecd-bb56-62b7be869ec9\") " pod="openshift-console-operator/console-operator-58897d9998-d9kk5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.829310 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5"] Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.834686 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-d9kk5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.842371 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29bbt\" (UniqueName: \"kubernetes.io/projected/d0d773ac-a40c-471b-a132-9a49a2e315c9-kube-api-access-29bbt\") pod \"controller-manager-879f6c89f-5pqds\" (UID: \"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.842424 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7bkrj"] Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.846312 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkvbb\" (UniqueName: \"kubernetes.io/projected/61c89047-11ed-4a9e-b861-25e144d42b5e-kube-api-access-nkvbb\") pod \"ingress-operator-5b745b69d9-jspq2\" (UID: \"61c89047-11ed-4a9e-b861-25e144d42b5e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.848738 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqmv"] Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.850038 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr"] Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.852547 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-776c5"] Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.853340 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m4xmw"] Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.856705 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5wzt\" (UniqueName: \"kubernetes.io/projected/7f915e7e-5c5a-4b5c-8b46-459a5bddb130-kube-api-access-k5wzt\") pod \"kube-storage-version-migrator-operator-b67b599dd-8hw77\" (UID: \"7f915e7e-5c5a-4b5c-8b46-459a5bddb130\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8hw77" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.881122 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.895559 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf425\" (UniqueName: \"kubernetes.io/projected/b3520644-d9b3-41e1-8293-7bc8529ffbca-kube-api-access-zf425\") pod \"etcd-operator-b45778765-j9qjz\" (UID: \"b3520644-d9b3-41e1-8293-7bc8529ffbca\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.899032 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbpxh\" (UniqueName: \"kubernetes.io/projected/1e01187b-7ff4-40bd-b726-33eedbc33f3c-kube-api-access-bbpxh\") pod \"multus-admission-controller-857f4d67dd-mqkjv\" (UID: \"1e01187b-7ff4-40bd-b726-33eedbc33f3c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mqkjv" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.905646 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.908032 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-7954f5f757-lpkrv"] Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.908424 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8hw77" Mar 14 08:59:28 crc kubenswrapper[4956]: E0314 08:59:28.908920 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:29.408895694 +0000 UTC m=+174.921587962 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.912577 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8qng2"] Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.912963 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-bound-sa-token\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.916772 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.925906 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.932850 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf9fc759-1702-415b-8879-0f5138ad1b46-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ksv2k\" (UID: \"cf9fc759-1702-415b-8879-0f5138ad1b46\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ksv2k" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.951922 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mqkjv" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.958184 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvktv\" (UniqueName: \"kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-kube-api-access-rvktv\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.959505 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tb6lx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.965931 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zd8f5" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.974515 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.992335 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pftg4"] Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.992850 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-st9hs"] Mar 14 08:59:28 crc kubenswrapper[4956]: I0314 08:59:28.993617 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mvrj\" (UniqueName: \"kubernetes.io/projected/16733c4e-be2e-4b5a-885c-6d2fab583caf-kube-api-access-5mvrj\") pod \"router-default-5444994796-hsfss\" (UID: \"16733c4e-be2e-4b5a-885c-6d2fab583caf\") " pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.007719 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vcs4p"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.012999 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:29 crc kubenswrapper[4956]: E0314 08:59:29.013435 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:29.513414458 +0000 UTC m=+175.026106786 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.018796 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98zn7\" (UniqueName: \"kubernetes.io/projected/59da700a-d9ae-49ac-a504-b7d9ea7ff0b6-kube-api-access-98zn7\") pod \"ingress-canary-xm4x6\" (UID: \"59da700a-d9ae-49ac-a504-b7d9ea7ff0b6\") " pod="openshift-ingress-canary/ingress-canary-xm4x6" Mar 14 08:59:29 crc kubenswrapper[4956]: W0314 08:59:29.026204 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb616bb02_0180_4af3_922a_6a09d2da3d67.slice/crio-6df7a3343306b981972e6379f81def9c7da2cd42d2f8614bf74158ee8a5c2f47 WatchSource:0}: Error finding container 6df7a3343306b981972e6379f81def9c7da2cd42d2f8614bf74158ee8a5c2f47: Status 404 returned error can't find the container with id 6df7a3343306b981972e6379f81def9c7da2cd42d2f8614bf74158ee8a5c2f47 Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.034864 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dczk\" (UniqueName: \"kubernetes.io/projected/93ee50e1-8448-4b8e-89be-bba3e87c0843-kube-api-access-2dczk\") pod \"dns-default-m5sxz\" (UID: \"93ee50e1-8448-4b8e-89be-bba3e87c0843\") " pod="openshift-dns/dns-default-m5sxz" Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.050379 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-m5sxz" Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.060418 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6wj4\" (UniqueName: \"kubernetes.io/projected/60855808-2138-47e7-9539-df7c86ebe635-kube-api-access-m6wj4\") pod \"csi-hostpathplugin-jqf62\" (UID: \"60855808-2138-47e7-9539-df7c86ebe635\") " pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.066468 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xm4x6" Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.073218 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.077785 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbvqz\" (UniqueName: \"kubernetes.io/projected/8f87de9e-e590-4bb0-8f62-a5f4bf1922cb-kube-api-access-vbvqz\") pod \"machine-config-server-d7cv4\" (UID: \"8f87de9e-e590-4bb0-8f62-a5f4bf1922cb\") " pod="openshift-machine-config-operator/machine-config-server-d7cv4" Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.083886 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7wqv"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.086744 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-dfkjs"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.105194 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.114669 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:29 crc kubenswrapper[4956]: E0314 08:59:29.114835 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:29.614810883 +0000 UTC m=+175.127503151 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.114932 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:29 crc kubenswrapper[4956]: E0314 08:59:29.115326 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:29.615311905 +0000 UTC m=+175.128004173 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.119805 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" Mar 14 08:59:29 crc kubenswrapper[4956]: W0314 08:59:29.135324 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93b35280_bd38_4a0c_8192_807ce4f2eb0b.slice/crio-1d288a08637cdedde6d746063c5df63619e92c75f1b07d85401a38ac4572ce92 WatchSource:0}: Error finding container 1d288a08637cdedde6d746063c5df63619e92c75f1b07d85401a38ac4572ce92: Status 404 returned error can't find the container with id 1d288a08637cdedde6d746063c5df63619e92c75f1b07d85401a38ac4572ce92 Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.153618 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.161394 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr"] Mar 14 08:59:29 crc kubenswrapper[4956]: W0314 08:59:29.171532 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc30cdb70_a97a_4750_8a55_e4aaa0e44f9b.slice/crio-5258b8043086b6e892eb307c875ef4595fae95bbaeea1221798c2acbb12f9a5c WatchSource:0}: Error finding container 5258b8043086b6e892eb307c875ef4595fae95bbaeea1221798c2acbb12f9a5c: Status 404 returned error can't find the container with id 5258b8043086b6e892eb307c875ef4595fae95bbaeea1221798c2acbb12f9a5c Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.216475 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:29 crc kubenswrapper[4956]: E0314 08:59:29.218519 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:29.718467475 +0000 UTC m=+175.231159743 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.221844 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:29 crc kubenswrapper[4956]: E0314 08:59:29.222147 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:29.722136276 +0000 UTC m=+175.234828544 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.226041 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ksv2k" Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.244593 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5hxg"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.267752 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.273689 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.293928 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2hj4h"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.308892 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.322843 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:29 crc kubenswrapper[4956]: E0314 08:59:29.323457 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:29.823437339 +0000 UTC m=+175.336129607 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.326310 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.334809 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jqf62" Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.339108 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8hw77"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.359947 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-d9kk5"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.359976 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-d7cv4" Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.379899 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4c865" event={"ID":"f5b73e6d-b82a-48a3-8ae9-4afe284b4102","Type":"ContainerStarted","Data":"67cfe840a3dc5d55407371c04bc9d0f7e0fdf0529fb69033cebd8a1560088b8d"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.379974 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4c865" event={"ID":"f5b73e6d-b82a-48a3-8ae9-4afe284b4102","Type":"ContainerStarted","Data":"457f33e4c21e94c6b9cc1423ef4c3fbbdbdf34fbae0fd73268ee6bd31a70ac97"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.382179 4956 generic.go:334] "Generic (PLEG): container finished" podID="9ee8610e-b521-4ee6-8dda-01c0aa50ae3e" containerID="c580645544cec9b006be0d56267620982211c49e5b62a8c9aa4185777286a44d" exitCode=0 Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.382715 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5" event={"ID":"9ee8610e-b521-4ee6-8dda-01c0aa50ae3e","Type":"ContainerDied","Data":"c580645544cec9b006be0d56267620982211c49e5b62a8c9aa4185777286a44d"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.382787 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5" event={"ID":"9ee8610e-b521-4ee6-8dda-01c0aa50ae3e","Type":"ContainerStarted","Data":"3e74fe132610cdeba6362c3c56655821a290e26868696ccf936011cc6293098e"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.390329 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-dfkjs" 
event={"ID":"b315e659-0690-463c-88c4-659124922ddc","Type":"ContainerStarted","Data":"a906fc6e41677c1e0183b5284e4b9856d02b0fcef0b600833cc61a8348f6f691"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.402494 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" event={"ID":"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a","Type":"ContainerStarted","Data":"9786cc6ed033e06cdbd092513e846858193131654d5d7dffb77be1819562314d"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.424157 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:29 crc kubenswrapper[4956]: E0314 08:59:29.425921 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:29.925902441 +0000 UTC m=+175.438594709 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.428240 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6" event={"ID":"095068b1-bf13-43a2-a250-a0eaeb60c6ae","Type":"ContainerStarted","Data":"5f07d21403085b8d3b5da7752abf68bee8ab804a1baf5c34aa6e99dce3a9d290"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.428287 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6" event={"ID":"095068b1-bf13-43a2-a250-a0eaeb60c6ae","Type":"ContainerStarted","Data":"28b3426a1cf085be3ede843bb1e7af6a282ef197fd2f3e2723c75ac59d90ff55"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.433022 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2" event={"ID":"93b35280-bd38-4a0c-8192-807ce4f2eb0b","Type":"ContainerStarted","Data":"1d288a08637cdedde6d746063c5df63619e92c75f1b07d85401a38ac4572ce92"} Mar 14 08:59:29 crc kubenswrapper[4956]: W0314 08:59:29.433115 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod581a4228_5ca8_400c_ba62_c16f4dddaddf.slice/crio-e85f174b0f678521b2341b42ac95e44aede6201daf8847ec5318b3d7d8b2d41e WatchSource:0}: Error finding container e85f174b0f678521b2341b42ac95e44aede6201daf8847ec5318b3d7d8b2d41e: Status 404 returned error can't find the container with id 
e85f174b0f678521b2341b42ac95e44aede6201daf8847ec5318b3d7d8b2d41e Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.441528 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pftg4" event={"ID":"b616bb02-0180-4af3-922a-6a09d2da3d67","Type":"ContainerStarted","Data":"6df7a3343306b981972e6379f81def9c7da2cd42d2f8614bf74158ee8a5c2f47"} Mar 14 08:59:29 crc kubenswrapper[4956]: W0314 08:59:29.445953 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eb00c56_ab05_4c9d_a1d4_a80a53775d4e.slice/crio-72a93aa6750359f1b00e770e04c43c8e02976f6c78aec2db408f30f0a67db7a4 WatchSource:0}: Error finding container 72a93aa6750359f1b00e770e04c43c8e02976f6c78aec2db408f30f0a67db7a4: Status 404 returned error can't find the container with id 72a93aa6750359f1b00e770e04c43c8e02976f6c78aec2db408f30f0a67db7a4 Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.450415 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqmv" event={"ID":"44ffbc86-37af-4fa5-bc0c-58084b961597","Type":"ContainerStarted","Data":"3e5a264d7d6679479b76a924c7e3100423c0dd635ffd0e76e62f89c36c7ebed3"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.450474 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqmv" event={"ID":"44ffbc86-37af-4fa5-bc0c-58084b961597","Type":"ContainerStarted","Data":"799f930330ad5ae354280e08a64b1e2060eb83371a48f63fad7fc480a479d7d3"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.452577 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-776c5" 
event={"ID":"9231d09e-fc87-40bf-85fe-c1d16e3ee943","Type":"ContainerStarted","Data":"a25d7d65a342b36b925f1318f810c0bc606cb07c93907231be5b3ea9e81c4a1f"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.453868 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8qng2" event={"ID":"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae","Type":"ContainerStarted","Data":"da001f9e097418bf6d812ac9f91b8a6dfa1c8d5680dd4c0738389f39a7393596"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.454531 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lpkrv" event={"ID":"343e0673-aad7-49c1-91b7-f5fd88579db3","Type":"ContainerStarted","Data":"fdc51b2f7052c3416cb4c217b89d1f79eba88f0b911f73231dba4123611ae46a"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.455411 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7wqv" event={"ID":"1666768a-cdbb-4ab1-83d8-b1ad0444f167","Type":"ContainerStarted","Data":"bb6f32fe45d3566a81b40eb6b44a15130980bf4abef41acf95c0307f9a606ca4"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.456655 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" event={"ID":"966098d4-f89c-4deb-a6e9-b1fae3316324","Type":"ContainerStarted","Data":"fba362328fa6c40562e4a56eb578e0d579f2eafb87875a4930d7eac4f68d3e17"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.458132 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" event={"ID":"c30cdb70-a97a-4750-8a55-e4aaa0e44f9b","Type":"ContainerStarted","Data":"5258b8043086b6e892eb307c875ef4595fae95bbaeea1221798c2acbb12f9a5c"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.459104 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" 
event={"ID":"22f83565-681d-490b-bd27-d21b456c6e25","Type":"ContainerStarted","Data":"b72c1ece8041995c281416a454d56d9bff339390cd8f647c858483557668b2c8"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.460321 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" event={"ID":"66ba3ebe-86e2-4711-a87d-9505fef09f76","Type":"ContainerStarted","Data":"c87e0e9d4b2d4e3cc782b4478cf96bd5cf1779098848ddf14813a9d8f11ff693"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.472852 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5hxg" event={"ID":"236467b8-ffbf-4e32-ba1c-b188938f8ff5","Type":"ContainerStarted","Data":"d1968d4b2993b7c6310b60a3cd9d1fa96b1a300c4051c1dff12fa872ff2a8cb1"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.484072 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj" event={"ID":"f3953bf9-9497-4f03-a67e-e7d63bf1df9c","Type":"ContainerStarted","Data":"12f1f3d13cacbbc3ea35e7dfd9597d971382502933c2a862c83139d53ef3f082"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.484117 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj" event={"ID":"f3953bf9-9497-4f03-a67e-e7d63bf1df9c","Type":"ContainerStarted","Data":"3f8f35abb349538664cd8498fae9237f6579bf6174a5eec87e6da63a4fa79e63"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.494424 4956 generic.go:334] "Generic (PLEG): container finished" podID="ea621d77-5364-416c-a378-6adf0e89fc30" containerID="33b935ec6bc807c8dfb10e375c95100a03f11c1d48d7d0fc0c4f9225f7876b43" exitCode=0 Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.494531 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" 
event={"ID":"ea621d77-5364-416c-a378-6adf0e89fc30","Type":"ContainerDied","Data":"33b935ec6bc807c8dfb10e375c95100a03f11c1d48d7d0fc0c4f9225f7876b43"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.494561 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" event={"ID":"ea621d77-5364-416c-a378-6adf0e89fc30","Type":"ContainerStarted","Data":"f1da19e7a011d81b2a3e1da062db3e6dfd30834cdddce3edb0e76f3f965a832b"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.500151 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk" event={"ID":"13a17846-77ac-4dba-a573-3b7d5c67da8d","Type":"ContainerStarted","Data":"07cfbee3e011b889ef70c2ec812fabfa3b4814a61145bacfaec7b7347ad174b4"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.500189 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk" event={"ID":"13a17846-77ac-4dba-a573-3b7d5c67da8d","Type":"ContainerStarted","Data":"1a1a3909a52cbea8482804cf69761e530fb95963667a334373a0d094fc446848"} Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.525338 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:29 crc kubenswrapper[4956]: E0314 08:59:29.525520 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:30.025500751 +0000 UTC m=+175.538193019 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.526046 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:29 crc kubenswrapper[4956]: E0314 08:59:29.527183 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:30.027170133 +0000 UTC m=+175.539862501 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.600246 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mqkjv"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.627568 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.639953 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:29 crc kubenswrapper[4956]: E0314 08:59:29.640674 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:30.14066011 +0000 UTC m=+175.653352378 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.698762 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-j9qjz"] Mar 14 08:59:29 crc kubenswrapper[4956]: W0314 08:59:29.717061 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e01187b_7ff4_40bd_b726_33eedbc33f3c.slice/crio-6adaf4d0e6d863a2f81e9ad4ea6c86bc2150e77d039b0f221debdbe9e483e7b6 WatchSource:0}: Error finding container 6adaf4d0e6d863a2f81e9ad4ea6c86bc2150e77d039b0f221debdbe9e483e7b6: Status 404 returned error can't find the container with id 6adaf4d0e6d863a2f81e9ad4ea6c86bc2150e77d039b0f221debdbe9e483e7b6 Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.720973 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zd8f5"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.723497 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.745882 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tb6lx"] Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.748095 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:29 crc kubenswrapper[4956]: E0314 08:59:29.748575 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:30.248557657 +0000 UTC m=+175.761249915 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.756467 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xm4x6"] Mar 14 08:59:29 crc kubenswrapper[4956]: W0314 08:59:29.766214 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3520644_d9b3_41e1_8293_7bc8529ffbca.slice/crio-892360b37d74b52aa11b29d625d61e6cfde45317de0babb5d2275f13495bb337 WatchSource:0}: Error finding container 892360b37d74b52aa11b29d625d61e6cfde45317de0babb5d2275f13495bb337: Status 404 returned error can't find the container with id 892360b37d74b52aa11b29d625d61e6cfde45317de0babb5d2275f13495bb337 Mar 14 08:59:29 crc kubenswrapper[4956]: W0314 08:59:29.787752 4956 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb827bdf7_bbd1_4b57_8485_09a312e20b43.slice/crio-7a4e1bb70f0d3827cc85eba35797cfbdb573c77a0c212640614d9f6504270c84 WatchSource:0}: Error finding container 7a4e1bb70f0d3827cc85eba35797cfbdb573c77a0c212640614d9f6504270c84: Status 404 returned error can't find the container with id 7a4e1bb70f0d3827cc85eba35797cfbdb573c77a0c212640614d9f6504270c84 Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.798719 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5pqds"] Mar 14 08:59:29 crc kubenswrapper[4956]: W0314 08:59:29.801470 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5ed96ac_e78f_424f_9aa0_f85a6f0321db.slice/crio-8e847b3e85de9c1473b7edd95d6f4858f9b17f7119278df81c99cc5e04c89309 WatchSource:0}: Error finding container 8e847b3e85de9c1473b7edd95d6f4858f9b17f7119278df81c99cc5e04c89309: Status 404 returned error can't find the container with id 8e847b3e85de9c1473b7edd95d6f4858f9b17f7119278df81c99cc5e04c89309 Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.808616 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m5sxz"] Mar 14 08:59:29 crc kubenswrapper[4956]: W0314 08:59:29.810356 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59da700a_d9ae_49ac_a504_b7d9ea7ff0b6.slice/crio-9d3ee850a3f5537a76b13d7550f2dcb175cac1bc5b2547875ff477d58dc9fcdb WatchSource:0}: Error finding container 9d3ee850a3f5537a76b13d7550f2dcb175cac1bc5b2547875ff477d58dc9fcdb: Status 404 returned error can't find the container with id 9d3ee850a3f5537a76b13d7550f2dcb175cac1bc5b2547875ff477d58dc9fcdb Mar 14 08:59:29 crc kubenswrapper[4956]: W0314 08:59:29.827504 4956 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0d773ac_a40c_471b_a132_9a49a2e315c9.slice/crio-a770668774fc6bde9d28c39fc19ffe5fee988109383160d061657bdc34abe559 WatchSource:0}: Error finding container a770668774fc6bde9d28c39fc19ffe5fee988109383160d061657bdc34abe559: Status 404 returned error can't find the container with id a770668774fc6bde9d28c39fc19ffe5fee988109383160d061657bdc34abe559 Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.849257 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:29 crc kubenswrapper[4956]: E0314 08:59:29.849355 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:30.349329927 +0000 UTC m=+175.862022195 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.849634 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:29 crc kubenswrapper[4956]: E0314 08:59:29.850201 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:30.350185398 +0000 UTC m=+175.862877666 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:29 crc kubenswrapper[4956]: I0314 08:59:29.957164 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:29 crc kubenswrapper[4956]: E0314 08:59:29.962644 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:30.462624678 +0000 UTC m=+175.975316946 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.065103 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jqf62"] Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.081432 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:30 crc kubenswrapper[4956]: E0314 08:59:30.081915 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:30.581903089 +0000 UTC m=+176.094595357 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:30 crc kubenswrapper[4956]: W0314 08:59:30.162822 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60855808_2138_47e7_9539_df7c86ebe635.slice/crio-cdfde385164f4597390f627faf559fa225240fab71ce194039e817bac36f0647 WatchSource:0}: Error finding container cdfde385164f4597390f627faf559fa225240fab71ce194039e817bac36f0647: Status 404 returned error can't find the container with id cdfde385164f4597390f627faf559fa225240fab71ce194039e817bac36f0647 Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.187651 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:30 crc kubenswrapper[4956]: E0314 08:59:30.188071 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:30.688056803 +0000 UTC m=+176.200749071 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.234862 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ksv2k"] Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.290587 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:30 crc kubenswrapper[4956]: E0314 08:59:30.290864 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:30.790853313 +0000 UTC m=+176.303545571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.360047 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gq4nk" podStartSLOduration=102.360019786 podStartE2EDuration="1m42.360019786s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:30.353048632 +0000 UTC m=+175.865740920" watchObservedRunningTime="2026-03-14 08:59:30.360019786 +0000 UTC m=+175.872712064" Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.391654 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:30 crc kubenswrapper[4956]: E0314 08:59:30.394822 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:30.894774481 +0000 UTC m=+176.407466829 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.396164 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-8gwm6" podStartSLOduration=102.396144505 podStartE2EDuration="1m42.396144505s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:30.394817932 +0000 UTC m=+175.907510230" watchObservedRunningTime="2026-03-14 08:59:30.396144505 +0000 UTC m=+175.908836773" Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.492648 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:30 crc kubenswrapper[4956]: E0314 08:59:30.493016 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:30.993001197 +0000 UTC m=+176.505693465 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.523616 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-7bkrj" podStartSLOduration=102.523592409 podStartE2EDuration="1m42.523592409s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:30.520178974 +0000 UTC m=+176.032871252" watchObservedRunningTime="2026-03-14 08:59:30.523592409 +0000 UTC m=+176.036284677" Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.582061 4956 generic.go:334] "Generic (PLEG): container finished" podID="966098d4-f89c-4deb-a6e9-b1fae3316324" containerID="634960be0218cf0917ca3cf7fceab3535a847407d41bf064ad5be8ae7d2b04ba" exitCode=0 Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.582169 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" event={"ID":"966098d4-f89c-4deb-a6e9-b1fae3316324","Type":"ContainerDied","Data":"634960be0218cf0917ca3cf7fceab3535a847407d41bf064ad5be8ae7d2b04ba"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.594932 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:30 crc kubenswrapper[4956]: E0314 08:59:30.595387 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:31.095371117 +0000 UTC m=+176.608063385 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.597718 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j" event={"ID":"581a4228-5ca8-400c-ba62-c16f4dddaddf","Type":"ContainerStarted","Data":"e85f174b0f678521b2341b42ac95e44aede6201daf8847ec5318b3d7d8b2d41e"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.599577 4956 ???:1] "http: TLS handshake error from 192.168.126.11:55746: no serving certificate available for the kubelet" Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.605807 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-dfkjs" event={"ID":"b315e659-0690-463c-88c4-659124922ddc","Type":"ContainerStarted","Data":"300dd932266021897b4e5148f3deb19ff57a5f7db6f24555a1b730d9f7f4bee8"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.607378 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7wqv" 
event={"ID":"1666768a-cdbb-4ab1-83d8-b1ad0444f167","Type":"ContainerStarted","Data":"aceefd8b1f4a9b49cff7abfe6de72fcf33437144ac1ba6eb523486d4d64dbb3e"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.615452 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8hw77" event={"ID":"7f915e7e-5c5a-4b5c-8b46-459a5bddb130","Type":"ContainerStarted","Data":"e444af536820dec39071b9f5163702c13950d49e84f4394942f871a2a3cdeba9"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.634385 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" event={"ID":"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e","Type":"ContainerStarted","Data":"72a93aa6750359f1b00e770e04c43c8e02976f6c78aec2db408f30f0a67db7a4"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.646673 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" event={"ID":"c30cdb70-a97a-4750-8a55-e4aaa0e44f9b","Type":"ContainerStarted","Data":"db57c274df0b49d9fd9f08a6835b48a5a40ee141c1ccd7d7a2776d0bf027ec32"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.648324 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-776c5" event={"ID":"9231d09e-fc87-40bf-85fe-c1d16e3ee943","Type":"ContainerStarted","Data":"f30b83ee8594b68295bb2e72d608e8743d5b02e8db8fde11f07e049ca1e9a87c"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.649565 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mqkjv" event={"ID":"1e01187b-7ff4-40bd-b726-33eedbc33f3c","Type":"ContainerStarted","Data":"6adaf4d0e6d863a2f81e9ad4ea6c86bc2150e77d039b0f221debdbe9e483e7b6"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.653583 4956 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" event={"ID":"d0d773ac-a40c-471b-a132-9a49a2e315c9","Type":"ContainerStarted","Data":"a770668774fc6bde9d28c39fc19ffe5fee988109383160d061657bdc34abe559"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.657609 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n" event={"ID":"4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f","Type":"ContainerStarted","Data":"2eb6d5161b9d9b18f43f25fcb32598bdf3b783926a7b64d44cf711e192fb2ec0"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.658815 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zd8f5" event={"ID":"b827bdf7-bbd1-4b57-8485-09a312e20b43","Type":"ContainerStarted","Data":"7a4e1bb70f0d3827cc85eba35797cfbdb573c77a0c212640614d9f6504270c84"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.662094 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-st9hs" event={"ID":"4443557c-c15b-435f-80a8-44916ff7c31a","Type":"ContainerStarted","Data":"fa75fbe4cb17dba32c4c07b3bd3bc6d3deb55a2aa032937bd3a4ea0c582134db"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.662141 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-st9hs" event={"ID":"4443557c-c15b-435f-80a8-44916ff7c31a","Type":"ContainerStarted","Data":"ade586f97e06bcf0a55273569ef0da18306cb7bb5fbf24cb07b1d70be3710e1a"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.668230 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8qng2" event={"ID":"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae","Type":"ContainerStarted","Data":"bc100bbf32be076761f698905be28c2aa4060c07026617be61281a8c18ed816a"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 
08:59:30.675179 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m5sxz" event={"ID":"93ee50e1-8448-4b8e-89be-bba3e87c0843","Type":"ContainerStarted","Data":"61c7a2574914717cb638262870e093503d3fca312dce4f6887443e41cc425ff3"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.679874 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5" event={"ID":"9ee8610e-b521-4ee6-8dda-01c0aa50ae3e","Type":"ContainerStarted","Data":"93a4efef376031e73d8b4e432bcb8cd8c0a8d1f64dbd57d62a413ee824cd4876"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.680421 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5" Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.683387 4956 ???:1] "http: TLS handshake error from 192.168.126.11:55760: no serving certificate available for the kubelet" Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.684922 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tb6lx" event={"ID":"c0e5952a-041e-4372-b077-b13baabaabd0","Type":"ContainerStarted","Data":"3fdda24c488b2f8fcf8ac663ac75992a27110161ed4b05f807fc11205d9e6375"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.689381 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" event={"ID":"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a","Type":"ContainerStarted","Data":"1e6949f7b0a995c2b2b1709357213db06a2b48332e2fcd2e53fa73affdbc1e82"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.701321 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" 
event={"ID":"b3520644-d9b3-41e1-8293-7bc8529ffbca","Type":"ContainerStarted","Data":"892360b37d74b52aa11b29d625d61e6cfde45317de0babb5d2275f13495bb337"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.702034 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:30 crc kubenswrapper[4956]: E0314 08:59:30.702527 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:31.202512305 +0000 UTC m=+176.715204573 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.714898 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hsfss" event={"ID":"16733c4e-be2e-4b5a-885c-6d2fab583caf","Type":"ContainerStarted","Data":"da49778ad37d3b8881065b7b22e072be2bc2597059270f67dd56ed980552b273"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.714941 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hsfss" 
event={"ID":"16733c4e-be2e-4b5a-885c-6d2fab583caf","Type":"ContainerStarted","Data":"1c83ad9bd70baa73fca8e27ba8a5ac979b88c576129bff3ef08ae407df847cbe"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.719096 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jqf62" event={"ID":"60855808-2138-47e7-9539-df7c86ebe635","Type":"ContainerStarted","Data":"cdfde385164f4597390f627faf559fa225240fab71ce194039e817bac36f0647"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.719954 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-d9kk5" event={"ID":"e7fa9107-c6f8-4ecd-bb56-62b7be869ec9","Type":"ContainerStarted","Data":"b124ff0b73a1472fe322a0945524b451cc8e2a1245569bb64a312d378dc7eeed"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.734521 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2" event={"ID":"61c89047-11ed-4a9e-b861-25e144d42b5e","Type":"ContainerStarted","Data":"50f33e16f3a07f95963910336989746c98e14cfea22ae0548e6d8a1b8795c00a"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.734561 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2" event={"ID":"61c89047-11ed-4a9e-b861-25e144d42b5e","Type":"ContainerStarted","Data":"412a3028708ca682ae7fe5e2e866f5350d97ad9b7f32425dadd0aa710d9f05de"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.737091 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" event={"ID":"66ba3ebe-86e2-4711-a87d-9505fef09f76","Type":"ContainerStarted","Data":"bd3b8e15f55383491a3c4a7543fcc821577efb457da555b76c721a71628dad88"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.737267 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.742775 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.744360 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld" event={"ID":"7ace8f29-86ea-40a3-8578-276187e6f1a2","Type":"ContainerStarted","Data":"b45e7816c6efe0669841a3a8138ab0f6eaf29aa306bcc8c29982f34b683b16fc"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.758544 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2" event={"ID":"93b35280-bd38-4a0c-8192-807ce4f2eb0b","Type":"ContainerStarted","Data":"2b99855e7b7e37762f96f1a83d679591e723aee6ca1d593aa63bfad0fefc7021"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.789193 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pftg4" event={"ID":"b616bb02-0180-4af3-922a-6a09d2da3d67","Type":"ContainerStarted","Data":"bffa8c929ce774ec6f1ca009672485532464f900835d0c968e9517fbbbf69c83"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.792136 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4c865" podStartSLOduration=102.792125107 podStartE2EDuration="1m42.792125107s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:30.790590739 +0000 UTC m=+176.303283027" watchObservedRunningTime="2026-03-14 08:59:30.792125107 +0000 UTC m=+176.304817375" Mar 14 08:59:30 crc 
kubenswrapper[4956]: I0314 08:59:30.792280 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" event={"ID":"e5ed96ac-e78f-424f-9aa0-f85a6f0321db","Type":"ContainerStarted","Data":"8e847b3e85de9c1473b7edd95d6f4858f9b17f7119278df81c99cc5e04c89309"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.796587 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xm4x6" event={"ID":"59da700a-d9ae-49ac-a504-b7d9ea7ff0b6","Type":"ContainerStarted","Data":"9d3ee850a3f5537a76b13d7550f2dcb175cac1bc5b2547875ff477d58dc9fcdb"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.799974 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt" event={"ID":"a78cde11-057e-4d0d-986e-bcda5f4964d6","Type":"ContainerStarted","Data":"196141ebcaae726762a94ab6db2cf882053f26a485581cbc0b9f75dadb7d11d8"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.805155 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:30 crc kubenswrapper[4956]: E0314 08:59:30.805347 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:31.305320446 +0000 UTC m=+176.818012704 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.805801 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:30 crc kubenswrapper[4956]: E0314 08:59:30.806187 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:31.306179297 +0000 UTC m=+176.818871565 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.816068 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lpkrv" event={"ID":"343e0673-aad7-49c1-91b7-f5fd88579db3","Type":"ContainerStarted","Data":"22d329fb0683237eda0fec36f68b5698990807c0dabdff7ee6eda5cb39b2a21b"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.819003 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-lpkrv" Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.826802 4956 ???:1] "http: TLS handshake error from 192.168.126.11:55774: no serving certificate available for the kubelet" Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.827319 4956 patch_prober.go:28] interesting pod/downloads-7954f5f757-lpkrv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.827352 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lpkrv" podUID="343e0673-aad7-49c1-91b7-f5fd88579db3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.840898 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqmv" podStartSLOduration=102.840881952 podStartE2EDuration="1m42.840881952s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:30.836899032 +0000 UTC m=+176.349591300" watchObservedRunningTime="2026-03-14 08:59:30.840881952 +0000 UTC m=+176.353574230" Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.845150 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-d7cv4" event={"ID":"8f87de9e-e590-4bb0-8f62-a5f4bf1922cb","Type":"ContainerStarted","Data":"5b9be616f6e2b1163421a68d6a20a0d46e475884d546958001bcbcdecf4b5df8"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.866948 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" event={"ID":"22f83565-681d-490b-bd27-d21b456c6e25","Type":"ContainerStarted","Data":"351edad8e62d54e768f359ed889d4a9c4f00913c657c3f91bf7eca8cf84fd7b0"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.867591 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.874608 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ksv2k" event={"ID":"cf9fc759-1702-415b-8879-0f5138ad1b46","Type":"ContainerStarted","Data":"e039d07a4845954b1f7326b3d7b01a1100fecb49fc88e0bf118cdbac7e2510a2"} Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.881746 4956 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-m4xmw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get 
\"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.881793 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" podUID="22f83565-681d-490b-bd27-d21b456c6e25" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.898301 4956 ???:1] "http: TLS handshake error from 192.168.126.11:55790: no serving certificate available for the kubelet" Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.906880 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:30 crc kubenswrapper[4956]: E0314 08:59:30.908293 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:31.408251648 +0000 UTC m=+176.920943916 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.912583 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-8qng2" podStartSLOduration=102.912569506 podStartE2EDuration="1m42.912569506s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:30.886522658 +0000 UTC m=+176.399214926" watchObservedRunningTime="2026-03-14 08:59:30.912569506 +0000 UTC m=+176.425261774" Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.956113 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-hsfss" podStartSLOduration=102.9560931 podStartE2EDuration="1m42.9560931s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:30.953858834 +0000 UTC m=+176.466551102" watchObservedRunningTime="2026-03-14 08:59:30.9560931 +0000 UTC m=+176.468785368" Mar 14 08:59:30 crc kubenswrapper[4956]: I0314 08:59:30.957835 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5" podStartSLOduration=102.957826623 podStartE2EDuration="1m42.957826623s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:30.916109354 +0000 UTC m=+176.428801622" watchObservedRunningTime="2026-03-14 08:59:30.957826623 +0000 UTC m=+176.470518891" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:30.994473 4956 ???:1] "http: TLS handshake error from 192.168.126.11:55802: no serving certificate available for the kubelet" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:30.997861 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7wqv" podStartSLOduration=102.9978339 podStartE2EDuration="1m42.9978339s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:30.992371504 +0000 UTC m=+176.505063772" watchObservedRunningTime="2026-03-14 08:59:30.9978339 +0000 UTC m=+176.510526178" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.008416 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:31 crc kubenswrapper[4956]: E0314 08:59:31.010764 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:31.510749481 +0000 UTC m=+177.023441749 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.085200 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" podStartSLOduration=103.085179355 podStartE2EDuration="1m43.085179355s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:31.071588467 +0000 UTC m=+176.584280735" watchObservedRunningTime="2026-03-14 08:59:31.085179355 +0000 UTC m=+176.597871623" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.111342 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:31 crc kubenswrapper[4956]: E0314 08:59:31.111717 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:31.611700046 +0000 UTC m=+177.124392314 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.133037 4956 ???:1] "http: TLS handshake error from 192.168.126.11:55812: no serving certificate available for the kubelet" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.154674 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.156816 4956 patch_prober.go:28] interesting pod/router-default-5444994796-hsfss container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.156979 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hsfss" podUID="16733c4e-be2e-4b5a-885c-6d2fab583caf" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.166987 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" podStartSLOduration=102.166970072 podStartE2EDuration="1m42.166970072s" podCreationTimestamp="2026-03-14 08:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 
08:59:31.130067713 +0000 UTC m=+176.642759981" watchObservedRunningTime="2026-03-14 08:59:31.166970072 +0000 UTC m=+176.679662340" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.204835 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pftg4" podStartSLOduration=103.204818965 podStartE2EDuration="1m43.204818965s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:31.177145846 +0000 UTC m=+176.689838114" watchObservedRunningTime="2026-03-14 08:59:31.204818965 +0000 UTC m=+176.717511223" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.215560 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:31 crc kubenswrapper[4956]: E0314 08:59:31.216965 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:31.716949617 +0000 UTC m=+177.229641885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.232289 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-dfkjs" podStartSLOduration=102.232274669 podStartE2EDuration="1m42.232274669s" podCreationTimestamp="2026-03-14 08:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:31.204728062 +0000 UTC m=+176.717420330" watchObservedRunningTime="2026-03-14 08:59:31.232274669 +0000 UTC m=+176.744966937" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.233406 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-lpkrv" podStartSLOduration=103.233396736 podStartE2EDuration="1m43.233396736s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:31.230789222 +0000 UTC m=+176.743481500" watchObservedRunningTime="2026-03-14 08:59:31.233396736 +0000 UTC m=+176.746089004" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.320042 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:31 crc kubenswrapper[4956]: E0314 08:59:31.320783 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:31.820766422 +0000 UTC m=+177.333458690 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.358227 4956 ???:1] "http: TLS handshake error from 192.168.126.11:55814: no serving certificate available for the kubelet" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.424155 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:31 crc kubenswrapper[4956]: E0314 08:59:31.424723 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:31.924712661 +0000 UTC m=+177.437404919 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.525703 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:31 crc kubenswrapper[4956]: E0314 08:59:31.526062 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:32.026047655 +0000 UTC m=+177.538739923 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.627028 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:31 crc kubenswrapper[4956]: E0314 08:59:31.627360 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:32.127345268 +0000 UTC m=+177.640037536 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.664411 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" podStartSLOduration=103.664395641 podStartE2EDuration="1m43.664395641s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:31.293641997 +0000 UTC m=+176.806334265" watchObservedRunningTime="2026-03-14 08:59:31.664395641 +0000 UTC m=+177.177087909" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.667128 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5pqds"] Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.707434 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr"] Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.728685 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:31 crc kubenswrapper[4956]: E0314 08:59:31.729052 4956 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:32.229038341 +0000 UTC m=+177.741730609 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.776820 4956 ???:1] "http: TLS handshake error from 192.168.126.11:55826: no serving certificate available for the kubelet" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.831339 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:31 crc kubenswrapper[4956]: E0314 08:59:31.831705 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:32.331688297 +0000 UTC m=+177.844380565 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.889859 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2" event={"ID":"61c89047-11ed-4a9e-b861-25e144d42b5e","Type":"ContainerStarted","Data":"6d96b4d590b96acc01dff5eb4973a05953a06169069ed77954b053b7566f573c"} Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.895999 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j" event={"ID":"581a4228-5ca8-400c-ba62-c16f4dddaddf","Type":"ContainerStarted","Data":"d55cb50685697115175c6ce020f1f2fb61d3f494c457e9d4ba33b67cb1055efd"} Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.896701 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.898311 4956 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-jkv2j container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.898344 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j" podUID="581a4228-5ca8-400c-ba62-c16f4dddaddf" containerName="olm-operator" probeResult="failure" output="Get 
\"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.903085 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ksv2k" event={"ID":"cf9fc759-1702-415b-8879-0f5138ad1b46","Type":"ContainerStarted","Data":"782dd9647a55feebade2cb8c7c6c247745a56b61486e13de71d6e65babaae907"} Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.906669 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" event={"ID":"e5ed96ac-e78f-424f-9aa0-f85a6f0321db","Type":"ContainerStarted","Data":"24c4dfb396b61d9575961f5d0fd9e8b55a6d37e729efd03bbf573470c43bcca4"} Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.907365 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.908396 4956 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-b4kwx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.908426 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" podUID="e5ed96ac-e78f-424f-9aa0-f85a6f0321db" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.913687 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" 
event={"ID":"ea621d77-5364-416c-a378-6adf0e89fc30","Type":"ContainerStarted","Data":"596757d2168fc1e33e94643e39e711396a796841bca224d9189d2d63971b7ed1"} Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.929597 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" event={"ID":"d0d773ac-a40c-471b-a132-9a49a2e315c9","Type":"ContainerStarted","Data":"21628bb448fa16a6377e8b99f3673d10973df26b93314567a21f7fabb60ea323"} Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.930261 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.931731 4956 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-5pqds container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.931764 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" podUID="d0d773ac-a40c-471b-a132-9a49a2e315c9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.932183 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:31 crc kubenswrapper[4956]: E0314 08:59:31.932583 4956 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:32.43256998 +0000 UTC m=+177.945262248 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.940839 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" event={"ID":"c30cdb70-a97a-4750-8a55-e4aaa0e44f9b","Type":"ContainerStarted","Data":"837238a28c89bc1f027745d50d0f88658e71634801b231d9129863392c4802a7"} Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.942017 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jspq2" podStartSLOduration=103.942004265 podStartE2EDuration="1m43.942004265s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:31.913080134 +0000 UTC m=+177.425772402" watchObservedRunningTime="2026-03-14 08:59:31.942004265 +0000 UTC m=+177.454696533" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.942199 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ksv2k" podStartSLOduration=103.942193639 podStartE2EDuration="1m43.942193639s" 
podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:31.939931003 +0000 UTC m=+177.452623271" watchObservedRunningTime="2026-03-14 08:59:31.942193639 +0000 UTC m=+177.454885907" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.956717 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-st9hs" event={"ID":"4443557c-c15b-435f-80a8-44916ff7c31a","Type":"ContainerStarted","Data":"2bff9372a98c3222e96c733d6b6b26e9a46f7b7a09e7c1d02b3fc3e814f4acf5"} Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.958891 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld" event={"ID":"7ace8f29-86ea-40a3-8578-276187e6f1a2","Type":"ContainerStarted","Data":"cd669cca2541dc931b21512f821e35141bf3487748347e275ba806bb63674fa4"} Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.959583 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.960743 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2" event={"ID":"93b35280-bd38-4a0c-8192-807ce4f2eb0b","Type":"ContainerStarted","Data":"98557043d17385a84ac9b6870cdfa2ccc1c0cdb32ee9833f8c6400e923f47e39"} Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.961133 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.962204 4956 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-vj2ld container/catalog-operator 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.962241 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld" podUID="7ace8f29-86ea-40a3-8578-276187e6f1a2" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.963376 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" event={"ID":"b3520644-d9b3-41e1-8293-7bc8529ffbca","Type":"ContainerStarted","Data":"d3765bdc7f4e612f1e4f56a29c37e8558e24f5214dcd27a6ae44cf2f96d6c043"} Mar 14 08:59:31 crc kubenswrapper[4956]: I0314 08:59:31.986219 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8hw77" event={"ID":"7f915e7e-5c5a-4b5c-8b46-459a5bddb130","Type":"ContainerStarted","Data":"6a3f27bf19d0973461f0dd92a9dcc8b0bf16596841daffd5ccbd6bc6912086ba"} Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.000329 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n" event={"ID":"4708c5fe-cd4d-48ec-9cca-2c6f89c2fd8f","Type":"ContainerStarted","Data":"00d85a7c56b542b4e05d5f668a624823c53e385e7a9a5221451a3ca47c2c9518"} Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.006134 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" podStartSLOduration=103.006119872 podStartE2EDuration="1m43.006119872s" podCreationTimestamp="2026-03-14 08:57:49 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:31.982876563 +0000 UTC m=+177.495568841" watchObservedRunningTime="2026-03-14 08:59:32.006119872 +0000 UTC m=+177.518812140" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.015554 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt" event={"ID":"a78cde11-057e-4d0d-986e-bcda5f4964d6","Type":"ContainerStarted","Data":"51fb483aca117266d29bae4821f6f7a8f931aa5056e622d37dc86db97879a129"} Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.015931 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt" event={"ID":"a78cde11-057e-4d0d-986e-bcda5f4964d6","Type":"ContainerStarted","Data":"a2ad35d7a6f9a6e794d7949a988ab2764c0fe0bc3ae4ee02629146e5f5d56521"} Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.018221 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zd8f5" event={"ID":"b827bdf7-bbd1-4b57-8485-09a312e20b43","Type":"ContainerStarted","Data":"5d66ceea36452ed383751a92a99b5bb84fb919ba52bf247280d084aca5d0b325"} Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.019742 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-d7cv4" event={"ID":"8f87de9e-e590-4bb0-8f62-a5f4bf1922cb","Type":"ContainerStarted","Data":"16715ccc1a3653f16e832e60934cebaf8ea9955bf32f0613a530847d22f334d6"} Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.021138 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" event={"ID":"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e","Type":"ContainerStarted","Data":"e8ef7959509dcd8e16bf0fbeafc42f319f009434e5944d4e5b80dca77cee3fce"} Mar 14 
08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.021820 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.022754 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-d9kk5" event={"ID":"e7fa9107-c6f8-4ecd-bb56-62b7be869ec9","Type":"ContainerStarted","Data":"f19c06f3a52a5e4bb4b2f71598ab545b1eae0f9480b7530f8af862e9b054c7f4"} Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.023176 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-d9kk5" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.025475 4956 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2hj4h container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.025526 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" podUID="0eb00c56-ab05-4c9d-a1d4-a80a53775d4e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.025838 4956 patch_prober.go:28] interesting pod/console-operator-58897d9998-d9kk5 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/readyz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.025877 4956 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console-operator/console-operator-58897d9998-d9kk5" podUID="e7fa9107-c6f8-4ecd-bb56-62b7be869ec9" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/readyz\": dial tcp 10.217.0.32:8443: connect: connection refused" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.026203 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xm4x6" event={"ID":"59da700a-d9ae-49ac-a504-b7d9ea7ff0b6","Type":"ContainerStarted","Data":"cbc81f2ca0acd8ee8c0c063796c354ffb748329e9517d7107e2300bcf8737e6d"} Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.027558 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m5sxz" event={"ID":"93ee50e1-8448-4b8e-89be-bba3e87c0843","Type":"ContainerStarted","Data":"a015e274373299f3e0c4ab8f6d031c453b504d7466a2de2bde108f37e725379a"} Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.027579 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m5sxz" event={"ID":"93ee50e1-8448-4b8e-89be-bba3e87c0843","Type":"ContainerStarted","Data":"b9c9cbebb38c5c47b83c18bc666d7a5a0633567b2b0bfefbbf7c58daaec24f94"} Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.027921 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-m5sxz" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.028946 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-776c5" event={"ID":"9231d09e-fc87-40bf-85fe-c1d16e3ee943","Type":"ContainerStarted","Data":"6015c735bc5a708b7bf61e25e4b49e3deee8bc054f387aae18718d6b1a9b6f58"} Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.030086 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5hxg" 
event={"ID":"236467b8-ffbf-4e32-ba1c-b188938f8ff5","Type":"ContainerStarted","Data":"c5c902af943fe1e190b196ae8b8c95e8c8c2298446cf4e1611c914d289480ec7"} Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.031464 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mqkjv" event={"ID":"1e01187b-7ff4-40bd-b726-33eedbc33f3c","Type":"ContainerStarted","Data":"df7a7648c507fef4831b87f9ffca202754a1d11ae836cd4e050c9229f4d838a4"} Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.031510 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mqkjv" event={"ID":"1e01187b-7ff4-40bd-b726-33eedbc33f3c","Type":"ContainerStarted","Data":"c932a8c68b41b7dcfb4f81cd625fae59578d9de45658570715946c15be506669"} Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.036466 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:32 crc kubenswrapper[4956]: E0314 08:59:32.037919 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:32.537902743 +0000 UTC m=+178.050595011 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.040857 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tb6lx" event={"ID":"c0e5952a-041e-4372-b077-b13baabaabd0","Type":"ContainerStarted","Data":"82e73eeb2bd7c17c16dec7108b8d21c7b16c8a0dbdc3ece1b5df050b2983cc4a"} Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.040899 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tb6lx" event={"ID":"c0e5952a-041e-4372-b077-b13baabaabd0","Type":"ContainerStarted","Data":"4272141466ab66584929841ed0e5d2c7fed70590aed36db90831a8f03beaf9c4"} Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.061933 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" event={"ID":"966098d4-f89c-4deb-a6e9-b1fae3316324","Type":"ContainerStarted","Data":"df50dc8aaa413f78f87c19d1ad0fc6b82304a343258a9bc21d7306a052914839"} Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.076682 4956 patch_prober.go:28] interesting pod/downloads-7954f5f757-lpkrv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.076770 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lpkrv" 
podUID="343e0673-aad7-49c1-91b7-f5fd88579db3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.094728 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c8m6n" podStartSLOduration=104.094707848 podStartE2EDuration="1m44.094707848s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.078103064 +0000 UTC m=+177.590795352" watchObservedRunningTime="2026-03-14 08:59:32.094707848 +0000 UTC m=+177.607400116" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.096950 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j" podStartSLOduration=103.096943034 podStartE2EDuration="1m43.096943034s" podCreationTimestamp="2026-03-14 08:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.005784083 +0000 UTC m=+177.518476351" watchObservedRunningTime="2026-03-14 08:59:32.096943034 +0000 UTC m=+177.609635302" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.138546 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:32 crc kubenswrapper[4956]: E0314 08:59:32.140455 4956 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:32.640439907 +0000 UTC m=+178.153132175 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.147741 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8hw77" podStartSLOduration=104.147726358 podStartE2EDuration="1m44.147726358s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.146907768 +0000 UTC m=+177.659600066" watchObservedRunningTime="2026-03-14 08:59:32.147726358 +0000 UTC m=+177.660418626" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.147909 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" podStartSLOduration=104.147905153 podStartE2EDuration="1m44.147905153s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.122067099 +0000 UTC m=+177.634759367" watchObservedRunningTime="2026-03-14 08:59:32.147905153 +0000 UTC m=+177.660597421" Mar 14 08:59:32 crc kubenswrapper[4956]: 
I0314 08:59:32.166394 4956 patch_prober.go:28] interesting pod/router-default-5444994796-hsfss container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 08:59:32 crc kubenswrapper[4956]: [-]has-synced failed: reason withheld Mar 14 08:59:32 crc kubenswrapper[4956]: [+]process-running ok Mar 14 08:59:32 crc kubenswrapper[4956]: healthz check failed Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.166458 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hsfss" podUID="16733c4e-be2e-4b5a-885c-6d2fab583caf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.229284 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" podStartSLOduration=104.229250299 podStartE2EDuration="1m44.229250299s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.201930008 +0000 UTC m=+177.714622276" watchObservedRunningTime="2026-03-14 08:59:32.229250299 +0000 UTC m=+177.741942567" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.243323 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:32 crc kubenswrapper[4956]: E0314 08:59:32.243764 4956 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:32.74375127 +0000 UTC m=+178.256443538 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.263369 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-j9qjz" podStartSLOduration=104.263347548 podStartE2EDuration="1m44.263347548s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.230732456 +0000 UTC m=+177.743424714" watchObservedRunningTime="2026-03-14 08:59:32.263347548 +0000 UTC m=+177.776039816" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.304725 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld" podStartSLOduration=103.304712758 podStartE2EDuration="1m43.304712758s" podCreationTimestamp="2026-03-14 08:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.303847747 +0000 UTC m=+177.816540015" watchObservedRunningTime="2026-03-14 08:59:32.304712758 +0000 UTC m=+177.817405026" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.305621 4956 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sw2ht" podStartSLOduration=104.305615621 podStartE2EDuration="1m44.305615621s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.26382253 +0000 UTC m=+177.776514798" watchObservedRunningTime="2026-03-14 08:59:32.305615621 +0000 UTC m=+177.818307889" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.343995 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:32 crc kubenswrapper[4956]: E0314 08:59:32.344281 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:32.844266523 +0000 UTC m=+178.356958781 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.346686 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-st9hs" podStartSLOduration=104.346676293 podStartE2EDuration="1m44.346676293s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.345190196 +0000 UTC m=+177.857882464" watchObservedRunningTime="2026-03-14 08:59:32.346676293 +0000 UTC m=+177.859368561" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.376576 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2" podStartSLOduration=103.376562338 podStartE2EDuration="1m43.376562338s" podCreationTimestamp="2026-03-14 08:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.374461985 +0000 UTC m=+177.887154253" watchObservedRunningTime="2026-03-14 08:59:32.376562338 +0000 UTC m=+177.889254606" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.400737 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tb6lx" podStartSLOduration=104.400724169 podStartE2EDuration="1m44.400724169s" podCreationTimestamp="2026-03-14 08:57:48 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.398605407 +0000 UTC m=+177.911297675" watchObservedRunningTime="2026-03-14 08:59:32.400724169 +0000 UTC m=+177.913416437" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.436194 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" podStartSLOduration=104.436174822 podStartE2EDuration="1m44.436174822s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.435881765 +0000 UTC m=+177.948574033" watchObservedRunningTime="2026-03-14 08:59:32.436174822 +0000 UTC m=+177.948867090" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.445449 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:32 crc kubenswrapper[4956]: E0314 08:59:32.445847 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:32.945827283 +0000 UTC m=+178.458519551 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.484660 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zd8f5" podStartSLOduration=103.484641219 podStartE2EDuration="1m43.484641219s" podCreationTimestamp="2026-03-14 08:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.462313773 +0000 UTC m=+177.975006041" watchObservedRunningTime="2026-03-14 08:59:32.484641219 +0000 UTC m=+177.997333487" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.485036 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-mqkjv" podStartSLOduration=104.485029859 podStartE2EDuration="1m44.485029859s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.483369168 +0000 UTC m=+177.996061426" watchObservedRunningTime="2026-03-14 08:59:32.485029859 +0000 UTC m=+177.997722117" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.491132 4956 ???:1] "http: TLS handshake error from 192.168.126.11:55834: no serving certificate available for the kubelet" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.502744 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.548928 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:32 crc kubenswrapper[4956]: E0314 08:59:32.549105 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:33.049081154 +0000 UTC m=+178.561773422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.549815 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:32 crc kubenswrapper[4956]: E0314 08:59:32.550130 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2026-03-14 08:59:33.05012195 +0000 UTC m=+178.562814218 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.556674 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-d7cv4" podStartSLOduration=7.556659003 podStartE2EDuration="7.556659003s" podCreationTimestamp="2026-03-14 08:59:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.554966271 +0000 UTC m=+178.067658539" watchObservedRunningTime="2026-03-14 08:59:32.556659003 +0000 UTC m=+178.069351271" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.557370 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-d9kk5" podStartSLOduration=104.557364271 podStartE2EDuration="1m44.557364271s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.530933052 +0000 UTC m=+178.043625310" watchObservedRunningTime="2026-03-14 08:59:32.557364271 +0000 UTC m=+178.070056539" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.573040 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s5hxg" 
podStartSLOduration=104.573022371 podStartE2EDuration="1m44.573022371s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.572051546 +0000 UTC m=+178.084743814" watchObservedRunningTime="2026-03-14 08:59:32.573022371 +0000 UTC m=+178.085714639" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.646449 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nvgzt" podStartSLOduration=104.646429989 podStartE2EDuration="1m44.646429989s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.612069823 +0000 UTC m=+178.124762091" watchObservedRunningTime="2026-03-14 08:59:32.646429989 +0000 UTC m=+178.159122247" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.646701 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xm4x6" podStartSLOduration=7.646696085 podStartE2EDuration="7.646696085s" podCreationTimestamp="2026-03-14 08:59:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.643695741 +0000 UTC m=+178.156388009" watchObservedRunningTime="2026-03-14 08:59:32.646696085 +0000 UTC m=+178.159388353" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.650741 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:32 crc 
kubenswrapper[4956]: E0314 08:59:32.651500 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:33.151456044 +0000 UTC m=+178.664148312 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.676023 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" podStartSLOduration=104.676006355 podStartE2EDuration="1m44.676006355s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.674956109 +0000 UTC m=+178.187648387" watchObservedRunningTime="2026-03-14 08:59:32.676006355 +0000 UTC m=+178.188698623" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.728473 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-m5sxz" podStartSLOduration=7.728457362 podStartE2EDuration="7.728457362s" podCreationTimestamp="2026-03-14 08:59:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.727426906 +0000 UTC m=+178.240119174" watchObservedRunningTime="2026-03-14 08:59:32.728457362 +0000 UTC m=+178.241149630" Mar 14 08:59:32 crc 
kubenswrapper[4956]: I0314 08:59:32.754142 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:32 crc kubenswrapper[4956]: E0314 08:59:32.754923 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:33.254912061 +0000 UTC m=+178.767604329 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.795544 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-776c5" podStartSLOduration=104.795528142 podStartE2EDuration="1m44.795528142s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:32.753355282 +0000 UTC m=+178.266047550" watchObservedRunningTime="2026-03-14 08:59:32.795528142 +0000 UTC m=+178.308220410" Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.855150 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:32 crc kubenswrapper[4956]: E0314 08:59:32.855636 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:33.355617269 +0000 UTC m=+178.868309537 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:32 crc kubenswrapper[4956]: I0314 08:59:32.957178 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:32 crc kubenswrapper[4956]: E0314 08:59:32.957604 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:33.457588718 +0000 UTC m=+178.970280986 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.057728 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:33 crc kubenswrapper[4956]: E0314 08:59:33.057894 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:33.557864036 +0000 UTC m=+179.070556304 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.058015 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:33 crc kubenswrapper[4956]: E0314 08:59:33.058322 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:33.558308937 +0000 UTC m=+179.071001205 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.058887 4956 generic.go:334] "Generic (PLEG): container finished" podID="ea6e9606-19aa-43f7-8344-ebc9f5c3f31a" containerID="1e6949f7b0a995c2b2b1709357213db06a2b48332e2fcd2e53fa73affdbc1e82" exitCode=0 Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.059162 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" event={"ID":"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a","Type":"ContainerDied","Data":"1e6949f7b0a995c2b2b1709357213db06a2b48332e2fcd2e53fa73affdbc1e82"} Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.060619 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jqf62" event={"ID":"60855808-2138-47e7-9539-df7c86ebe635","Type":"ContainerStarted","Data":"26869e63805d5d953aa7417d04f6caea1b8ada156260684c87b2eb10c01e3a84"} Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.063711 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" event={"ID":"966098d4-f89c-4deb-a6e9-b1fae3316324","Type":"ContainerStarted","Data":"b89e8006ff28a42ff5633e406cacface1a79c9e1ea5c79665e19cf89609fb0f7"} Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.065032 4956 patch_prober.go:28] interesting pod/downloads-7954f5f757-lpkrv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: 
connection refused" start-of-body= Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.065093 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lpkrv" podUID="343e0673-aad7-49c1-91b7-f5fd88579db3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.065442 4956 patch_prober.go:28] interesting pod/console-operator-58897d9998-d9kk5 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/readyz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.065494 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-d9kk5" podUID="e7fa9107-c6f8-4ecd-bb56-62b7be869ec9" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/readyz\": dial tcp 10.217.0.32:8443: connect: connection refused" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.065728 4956 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-5pqds container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.065793 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" podUID="d0d773ac-a40c-471b-a132-9a49a2e315c9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.066128 4956 patch_prober.go:28] 
interesting pod/marketplace-operator-79b997595-2hj4h container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.066175 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" podUID="0eb00c56-ab05-4c9d-a1d4-a80a53775d4e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.066431 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" podUID="d0d773ac-a40c-471b-a132-9a49a2e315c9" containerName="controller-manager" containerID="cri-o://21628bb448fa16a6377e8b99f3673d10973df26b93314567a21f7fabb60ea323" gracePeriod=30 Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.067141 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.067227 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.068665 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" podUID="66ba3ebe-86e2-4711-a87d-9505fef09f76" containerName="route-controller-manager" containerID="cri-o://bd3b8e15f55383491a3c4a7543fcc821577efb457da555b76c721a71628dad88" gracePeriod=30 Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.082647 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vj2ld" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.086980 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jkv2j" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.098199 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6bzr5" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.157857 4956 patch_prober.go:28] interesting pod/router-default-5444994796-hsfss container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 08:59:33 crc kubenswrapper[4956]: [-]has-synced failed: reason withheld Mar 14 08:59:33 crc kubenswrapper[4956]: [+]process-running ok Mar 14 08:59:33 crc kubenswrapper[4956]: healthz check failed Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.158195 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hsfss" podUID="16733c4e-be2e-4b5a-885c-6d2fab583caf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.158640 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:33 crc kubenswrapper[4956]: E0314 08:59:33.158825 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-14 08:59:33.65880576 +0000 UTC m=+179.171498028 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.159646 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:33 crc kubenswrapper[4956]: E0314 08:59:33.160195 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:33.660175574 +0000 UTC m=+179.172867842 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.232880 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.233208 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.234859 4956 patch_prober.go:28] interesting pod/apiserver-76f77b778f-vcs4p container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.234922 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" podUID="966098d4-f89c-4deb-a6e9-b1fae3316324" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.266977 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:33 crc kubenswrapper[4956]: E0314 
08:59:33.267124 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:33.767109807 +0000 UTC m=+179.279802075 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.267235 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:33 crc kubenswrapper[4956]: E0314 08:59:33.267570 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:33.767563818 +0000 UTC m=+179.280256086 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.368352 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:33 crc kubenswrapper[4956]: E0314 08:59:33.368531 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:33.868504102 +0000 UTC m=+179.381196370 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.368776 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:33 crc kubenswrapper[4956]: E0314 08:59:33.369046 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:33.869039926 +0000 UTC m=+179.381732194 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.469847 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:33 crc kubenswrapper[4956]: E0314 08:59:33.469951 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:33.969929959 +0000 UTC m=+179.482622227 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.470097 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:33 crc kubenswrapper[4956]: E0314 08:59:33.470405 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:33.97039616 +0000 UTC m=+179.483088428 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.570957 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:33 crc kubenswrapper[4956]: E0314 08:59:33.571039 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:34.071021326 +0000 UTC m=+179.583713594 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.571523 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:33 crc kubenswrapper[4956]: E0314 08:59:33.571880 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:34.071865227 +0000 UTC m=+179.584557495 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.576849 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b4kwx" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.672273 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:33 crc kubenswrapper[4956]: E0314 08:59:33.673073 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:34.173054277 +0000 UTC m=+179.685746545 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.776605 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:33 crc kubenswrapper[4956]: E0314 08:59:33.776883 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:34.276872163 +0000 UTC m=+179.789564431 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.820647 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.837113 4956 ???:1] "http: TLS handshake error from 192.168.126.11:55838: no serving certificate available for the kubelet" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.877752 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:33 crc kubenswrapper[4956]: E0314 08:59:33.878118 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:34.378098894 +0000 UTC m=+179.890791162 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.928225 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.979453 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh6b2\" (UniqueName: \"kubernetes.io/projected/66ba3ebe-86e2-4711-a87d-9505fef09f76-kube-api-access-qh6b2\") pod \"66ba3ebe-86e2-4711-a87d-9505fef09f76\" (UID: \"66ba3ebe-86e2-4711-a87d-9505fef09f76\") " Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.979549 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ba3ebe-86e2-4711-a87d-9505fef09f76-config\") pod \"66ba3ebe-86e2-4711-a87d-9505fef09f76\" (UID: \"66ba3ebe-86e2-4711-a87d-9505fef09f76\") " Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.979627 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ba3ebe-86e2-4711-a87d-9505fef09f76-serving-cert\") pod \"66ba3ebe-86e2-4711-a87d-9505fef09f76\" (UID: \"66ba3ebe-86e2-4711-a87d-9505fef09f76\") " Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.979663 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66ba3ebe-86e2-4711-a87d-9505fef09f76-client-ca\") pod 
\"66ba3ebe-86e2-4711-a87d-9505fef09f76\" (UID: \"66ba3ebe-86e2-4711-a87d-9505fef09f76\") " Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.979996 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d"] Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.980400 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:33 crc kubenswrapper[4956]: E0314 08:59:33.980875 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ba3ebe-86e2-4711-a87d-9505fef09f76" containerName="route-controller-manager" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.980904 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ba3ebe-86e2-4711-a87d-9505fef09f76" containerName="route-controller-manager" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.981132 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="66ba3ebe-86e2-4711-a87d-9505fef09f76" containerName="route-controller-manager" Mar 14 08:59:33 crc kubenswrapper[4956]: E0314 08:59:33.981315 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:34.481285234 +0000 UTC m=+179.993977502 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.981664 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.983198 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d"] Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.983951 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ba3ebe-86e2-4711-a87d-9505fef09f76-client-ca" (OuterVolumeSpecName: "client-ca") pod "66ba3ebe-86e2-4711-a87d-9505fef09f76" (UID: "66ba3ebe-86e2-4711-a87d-9505fef09f76"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:59:33 crc kubenswrapper[4956]: I0314 08:59:33.984000 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ba3ebe-86e2-4711-a87d-9505fef09f76-config" (OuterVolumeSpecName: "config") pod "66ba3ebe-86e2-4711-a87d-9505fef09f76" (UID: "66ba3ebe-86e2-4711-a87d-9505fef09f76"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.002138 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66ba3ebe-86e2-4711-a87d-9505fef09f76-kube-api-access-qh6b2" (OuterVolumeSpecName: "kube-api-access-qh6b2") pod "66ba3ebe-86e2-4711-a87d-9505fef09f76" (UID: "66ba3ebe-86e2-4711-a87d-9505fef09f76"). InnerVolumeSpecName "kube-api-access-qh6b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.013582 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ba3ebe-86e2-4711-a87d-9505fef09f76-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "66ba3ebe-86e2-4711-a87d-9505fef09f76" (UID: "66ba3ebe-86e2-4711-a87d-9505fef09f76"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.079076 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j2vtm"] Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.080045 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j2vtm" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.080962 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.081207 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c626386-a00a-419c-9116-f2e19d48807a-config\") pod \"route-controller-manager-599fc45645-tf88d\" (UID: \"9c626386-a00a-419c-9116-f2e19d48807a\") " pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.081245 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c626386-a00a-419c-9116-f2e19d48807a-serving-cert\") pod \"route-controller-manager-599fc45645-tf88d\" (UID: \"9c626386-a00a-419c-9116-f2e19d48807a\") " pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.081272 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whn47\" (UniqueName: \"kubernetes.io/projected/9c626386-a00a-419c-9116-f2e19d48807a-kube-api-access-whn47\") pod \"route-controller-manager-599fc45645-tf88d\" (UID: \"9c626386-a00a-419c-9116-f2e19d48807a\") " pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.081348 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c626386-a00a-419c-9116-f2e19d48807a-client-ca\") pod \"route-controller-manager-599fc45645-tf88d\" (UID: \"9c626386-a00a-419c-9116-f2e19d48807a\") " pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 08:59:34 crc kubenswrapper[4956]: E0314 08:59:34.081391 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:34.581352266 +0000 UTC m=+180.094044534 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.081749 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.081761 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh6b2\" (UniqueName: \"kubernetes.io/projected/66ba3ebe-86e2-4711-a87d-9505fef09f76-kube-api-access-qh6b2\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.081782 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ba3ebe-86e2-4711-a87d-9505fef09f76-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.081808 4956 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ba3ebe-86e2-4711-a87d-9505fef09f76-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.081822 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66ba3ebe-86e2-4711-a87d-9505fef09f76-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.090829 4956 generic.go:334] "Generic (PLEG): container finished" podID="66ba3ebe-86e2-4711-a87d-9505fef09f76" containerID="bd3b8e15f55383491a3c4a7543fcc821577efb457da555b76c721a71628dad88" exitCode=0 Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.090938 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" event={"ID":"66ba3ebe-86e2-4711-a87d-9505fef09f76","Type":"ContainerDied","Data":"bd3b8e15f55383491a3c4a7543fcc821577efb457da555b76c721a71628dad88"} Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.090967 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" event={"ID":"66ba3ebe-86e2-4711-a87d-9505fef09f76","Type":"ContainerDied","Data":"c87e0e9d4b2d4e3cc782b4478cf96bd5cf1779098848ddf14813a9d8f11ff693"} Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.090984 4956 scope.go:117] "RemoveContainer" containerID="bd3b8e15f55383491a3c4a7543fcc821577efb457da555b76c721a71628dad88" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.091395 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.091543 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.108725 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j2vtm"] Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.118867 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jqf62" event={"ID":"60855808-2138-47e7-9539-df7c86ebe635","Type":"ContainerStarted","Data":"6868302b0ba2d68d283f6f21f7f95dee025d4d48604869aab2e2e67ebf24c509"} Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.118908 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jqf62" event={"ID":"60855808-2138-47e7-9539-df7c86ebe635","Type":"ContainerStarted","Data":"b48ca45c34c28290da33673697d45f90b744990bb6d46b6b040f4d0dbd8f3066"} Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.122807 4956 scope.go:117] "RemoveContainer" containerID="bd3b8e15f55383491a3c4a7543fcc821577efb457da555b76c721a71628dad88" Mar 14 08:59:34 crc kubenswrapper[4956]: E0314 08:59:34.123378 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd3b8e15f55383491a3c4a7543fcc821577efb457da555b76c721a71628dad88\": container with ID starting with bd3b8e15f55383491a3c4a7543fcc821577efb457da555b76c721a71628dad88 not found: ID does not exist" containerID="bd3b8e15f55383491a3c4a7543fcc821577efb457da555b76c721a71628dad88" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.123413 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd3b8e15f55383491a3c4a7543fcc821577efb457da555b76c721a71628dad88"} err="failed to get container status \"bd3b8e15f55383491a3c4a7543fcc821577efb457da555b76c721a71628dad88\": rpc error: code = NotFound desc = could not find container 
\"bd3b8e15f55383491a3c4a7543fcc821577efb457da555b76c721a71628dad88\": container with ID starting with bd3b8e15f55383491a3c4a7543fcc821577efb457da555b76c721a71628dad88 not found: ID does not exist" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.126232 4956 generic.go:334] "Generic (PLEG): container finished" podID="d0d773ac-a40c-471b-a132-9a49a2e315c9" containerID="21628bb448fa16a6377e8b99f3673d10973df26b93314567a21f7fabb60ea323" exitCode=0 Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.127435 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.127581 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" event={"ID":"d0d773ac-a40c-471b-a132-9a49a2e315c9","Type":"ContainerDied","Data":"21628bb448fa16a6377e8b99f3673d10973df26b93314567a21f7fabb60ea323"} Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.127615 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5pqds" event={"ID":"d0d773ac-a40c-471b-a132-9a49a2e315c9","Type":"ContainerDied","Data":"a770668774fc6bde9d28c39fc19ffe5fee988109383160d061657bdc34abe559"} Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.127633 4956 scope.go:117] "RemoveContainer" containerID="21628bb448fa16a6377e8b99f3673d10973df26b93314567a21f7fabb60ea323" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.149884 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-whkvt" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.154050 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.176794 4956 
patch_prober.go:28] interesting pod/router-default-5444994796-hsfss container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 08:59:34 crc kubenswrapper[4956]: [-]has-synced failed: reason withheld Mar 14 08:59:34 crc kubenswrapper[4956]: [+]process-running ok Mar 14 08:59:34 crc kubenswrapper[4956]: healthz check failed Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.176844 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hsfss" podUID="16733c4e-be2e-4b5a-885c-6d2fab583caf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.185129 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-d9kk5" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.185586 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29bbt\" (UniqueName: \"kubernetes.io/projected/d0d773ac-a40c-471b-a132-9a49a2e315c9-kube-api-access-29bbt\") pod \"d0d773ac-a40c-471b-a132-9a49a2e315c9\" (UID: \"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.185750 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-client-ca\") pod \"d0d773ac-a40c-471b-a132-9a49a2e315c9\" (UID: \"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.185960 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-config\") pod \"d0d773ac-a40c-471b-a132-9a49a2e315c9\" (UID: 
\"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.186039 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0d773ac-a40c-471b-a132-9a49a2e315c9-serving-cert\") pod \"d0d773ac-a40c-471b-a132-9a49a2e315c9\" (UID: \"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.186110 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-proxy-ca-bundles\") pod \"d0d773ac-a40c-471b-a132-9a49a2e315c9\" (UID: \"d0d773ac-a40c-471b-a132-9a49a2e315c9\") " Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.186396 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c626386-a00a-419c-9116-f2e19d48807a-config\") pod \"route-controller-manager-599fc45645-tf88d\" (UID: \"9c626386-a00a-419c-9116-f2e19d48807a\") " pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.186560 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c626386-a00a-419c-9116-f2e19d48807a-serving-cert\") pod \"route-controller-manager-599fc45645-tf88d\" (UID: \"9c626386-a00a-419c-9116-f2e19d48807a\") " pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.186646 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-catalog-content\") pod \"certified-operators-j2vtm\" (UID: \"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3\") " 
pod="openshift-marketplace/certified-operators-j2vtm" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.186729 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpwz5\" (UniqueName: \"kubernetes.io/projected/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-kube-api-access-vpwz5\") pod \"certified-operators-j2vtm\" (UID: \"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3\") " pod="openshift-marketplace/certified-operators-j2vtm" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.186823 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whn47\" (UniqueName: \"kubernetes.io/projected/9c626386-a00a-419c-9116-f2e19d48807a-kube-api-access-whn47\") pod \"route-controller-manager-599fc45645-tf88d\" (UID: \"9c626386-a00a-419c-9116-f2e19d48807a\") " pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.187028 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-utilities\") pod \"certified-operators-j2vtm\" (UID: \"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3\") " pod="openshift-marketplace/certified-operators-j2vtm" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.187195 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.187294 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/9c626386-a00a-419c-9116-f2e19d48807a-client-ca\") pod \"route-controller-manager-599fc45645-tf88d\" (UID: \"9c626386-a00a-419c-9116-f2e19d48807a\") " pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.189665 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-client-ca" (OuterVolumeSpecName: "client-ca") pod "d0d773ac-a40c-471b-a132-9a49a2e315c9" (UID: "d0d773ac-a40c-471b-a132-9a49a2e315c9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.190088 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-config" (OuterVolumeSpecName: "config") pod "d0d773ac-a40c-471b-a132-9a49a2e315c9" (UID: "d0d773ac-a40c-471b-a132-9a49a2e315c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.192500 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0d773ac-a40c-471b-a132-9a49a2e315c9-kube-api-access-29bbt" (OuterVolumeSpecName: "kube-api-access-29bbt") pod "d0d773ac-a40c-471b-a132-9a49a2e315c9" (UID: "d0d773ac-a40c-471b-a132-9a49a2e315c9"). InnerVolumeSpecName "kube-api-access-29bbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.192636 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d0d773ac-a40c-471b-a132-9a49a2e315c9" (UID: "d0d773ac-a40c-471b-a132-9a49a2e315c9"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.195358 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c626386-a00a-419c-9116-f2e19d48807a-config\") pod \"route-controller-manager-599fc45645-tf88d\" (UID: \"9c626386-a00a-419c-9116-f2e19d48807a\") " pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 08:59:34 crc kubenswrapper[4956]: E0314 08:59:34.196798 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:34.696784941 +0000 UTC m=+180.209477209 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.198174 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c626386-a00a-419c-9116-f2e19d48807a-client-ca\") pod \"route-controller-manager-599fc45645-tf88d\" (UID: \"9c626386-a00a-419c-9116-f2e19d48807a\") " pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.208226 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c626386-a00a-419c-9116-f2e19d48807a-serving-cert\") pod 
\"route-controller-manager-599fc45645-tf88d\" (UID: \"9c626386-a00a-419c-9116-f2e19d48807a\") " pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.214575 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0d773ac-a40c-471b-a132-9a49a2e315c9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d0d773ac-a40c-471b-a132-9a49a2e315c9" (UID: "d0d773ac-a40c-471b-a132-9a49a2e315c9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.218088 4956 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.223948 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr"] Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.230230 4956 scope.go:117] "RemoveContainer" containerID="21628bb448fa16a6377e8b99f3673d10973df26b93314567a21f7fabb60ea323" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.237380 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whn47\" (UniqueName: \"kubernetes.io/projected/9c626386-a00a-419c-9116-f2e19d48807a-kube-api-access-whn47\") pod \"route-controller-manager-599fc45645-tf88d\" (UID: \"9c626386-a00a-419c-9116-f2e19d48807a\") " pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.239139 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7x8jr"] Mar 14 08:59:34 crc kubenswrapper[4956]: E0314 08:59:34.240732 4956 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"21628bb448fa16a6377e8b99f3673d10973df26b93314567a21f7fabb60ea323\": container with ID starting with 21628bb448fa16a6377e8b99f3673d10973df26b93314567a21f7fabb60ea323 not found: ID does not exist" containerID="21628bb448fa16a6377e8b99f3673d10973df26b93314567a21f7fabb60ea323" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.240802 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21628bb448fa16a6377e8b99f3673d10973df26b93314567a21f7fabb60ea323"} err="failed to get container status \"21628bb448fa16a6377e8b99f3673d10973df26b93314567a21f7fabb60ea323\": rpc error: code = NotFound desc = could not find container \"21628bb448fa16a6377e8b99f3673d10973df26b93314567a21f7fabb60ea323\": container with ID starting with 21628bb448fa16a6377e8b99f3673d10973df26b93314567a21f7fabb60ea323 not found: ID does not exist" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.288204 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.288661 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-utilities\") pod \"certified-operators-j2vtm\" (UID: \"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3\") " pod="openshift-marketplace/certified-operators-j2vtm" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.288865 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-catalog-content\") pod 
\"certified-operators-j2vtm\" (UID: \"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3\") " pod="openshift-marketplace/certified-operators-j2vtm" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.288895 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpwz5\" (UniqueName: \"kubernetes.io/projected/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-kube-api-access-vpwz5\") pod \"certified-operators-j2vtm\" (UID: \"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3\") " pod="openshift-marketplace/certified-operators-j2vtm" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.288951 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.288961 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0d773ac-a40c-471b-a132-9a49a2e315c9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.288969 4956 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.288979 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29bbt\" (UniqueName: \"kubernetes.io/projected/d0d773ac-a40c-471b-a132-9a49a2e315c9-kube-api-access-29bbt\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.288988 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0d773ac-a40c-471b-a132-9a49a2e315c9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.292447 4956 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-8xw98"] Mar 14 08:59:34 crc kubenswrapper[4956]: E0314 08:59:34.292656 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0d773ac-a40c-471b-a132-9a49a2e315c9" containerName="controller-manager" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.292671 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0d773ac-a40c-471b-a132-9a49a2e315c9" containerName="controller-manager" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.292771 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0d773ac-a40c-471b-a132-9a49a2e315c9" containerName="controller-manager" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.293388 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8xw98" Mar 14 08:59:34 crc kubenswrapper[4956]: E0314 08:59:34.293977 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:34.793962061 +0000 UTC m=+180.306654329 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.294323 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-utilities\") pod \"certified-operators-j2vtm\" (UID: \"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3\") " pod="openshift-marketplace/certified-operators-j2vtm" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.296578 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-catalog-content\") pod \"certified-operators-j2vtm\" (UID: \"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3\") " pod="openshift-marketplace/certified-operators-j2vtm" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.298612 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.332027 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8xw98"] Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.343170 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.356386 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpwz5\" (UniqueName: \"kubernetes.io/projected/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-kube-api-access-vpwz5\") pod \"certified-operators-j2vtm\" (UID: \"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3\") " pod="openshift-marketplace/certified-operators-j2vtm" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.391056 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb4vj\" (UniqueName: \"kubernetes.io/projected/6a3b7192-2792-4295-b25d-a22c476cd174-kube-api-access-rb4vj\") pod \"community-operators-8xw98\" (UID: \"6a3b7192-2792-4295-b25d-a22c476cd174\") " pod="openshift-marketplace/community-operators-8xw98" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.391110 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3b7192-2792-4295-b25d-a22c476cd174-utilities\") pod \"community-operators-8xw98\" (UID: \"6a3b7192-2792-4295-b25d-a22c476cd174\") " pod="openshift-marketplace/community-operators-8xw98" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.391157 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.391195 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6a3b7192-2792-4295-b25d-a22c476cd174-catalog-content\") pod \"community-operators-8xw98\" (UID: \"6a3b7192-2792-4295-b25d-a22c476cd174\") " pod="openshift-marketplace/community-operators-8xw98" Mar 14 08:59:34 crc kubenswrapper[4956]: E0314 08:59:34.392936 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:34.892924546 +0000 UTC m=+180.405616814 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.416792 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j2vtm" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.454033 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.477373 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5pqds"] Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.480216 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9n4jp"] Mar 14 08:59:34 crc kubenswrapper[4956]: E0314 08:59:34.480776 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6e9606-19aa-43f7-8344-ebc9f5c3f31a" containerName="collect-profiles" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.480804 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6e9606-19aa-43f7-8344-ebc9f5c3f31a" containerName="collect-profiles" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.481160 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea6e9606-19aa-43f7-8344-ebc9f5c3f31a" containerName="collect-profiles" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.484976 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9n4jp" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.486132 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5pqds"] Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.499163 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:34 crc kubenswrapper[4956]: E0314 08:59:34.500332 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:35.000286989 +0000 UTC m=+180.512979257 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.502202 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3b7192-2792-4295-b25d-a22c476cd174-catalog-content\") pod \"community-operators-8xw98\" (UID: \"6a3b7192-2792-4295-b25d-a22c476cd174\") " pod="openshift-marketplace/community-operators-8xw98" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.502410 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb4vj\" (UniqueName: \"kubernetes.io/projected/6a3b7192-2792-4295-b25d-a22c476cd174-kube-api-access-rb4vj\") pod \"community-operators-8xw98\" (UID: \"6a3b7192-2792-4295-b25d-a22c476cd174\") " pod="openshift-marketplace/community-operators-8xw98" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.502698 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3b7192-2792-4295-b25d-a22c476cd174-utilities\") pod \"community-operators-8xw98\" (UID: \"6a3b7192-2792-4295-b25d-a22c476cd174\") " pod="openshift-marketplace/community-operators-8xw98" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.502882 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: 
\"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.504533 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3b7192-2792-4295-b25d-a22c476cd174-catalog-content\") pod \"community-operators-8xw98\" (UID: \"6a3b7192-2792-4295-b25d-a22c476cd174\") " pod="openshift-marketplace/community-operators-8xw98" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.504671 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3b7192-2792-4295-b25d-a22c476cd174-utilities\") pod \"community-operators-8xw98\" (UID: \"6a3b7192-2792-4295-b25d-a22c476cd174\") " pod="openshift-marketplace/community-operators-8xw98" Mar 14 08:59:34 crc kubenswrapper[4956]: E0314 08:59:34.505249 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:35.005230482 +0000 UTC m=+180.517922750 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.505444 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9n4jp"] Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.544050 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb4vj\" (UniqueName: \"kubernetes.io/projected/6a3b7192-2792-4295-b25d-a22c476cd174-kube-api-access-rb4vj\") pod \"community-operators-8xw98\" (UID: \"6a3b7192-2792-4295-b25d-a22c476cd174\") " pod="openshift-marketplace/community-operators-8xw98" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.603683 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jbrf\" (UniqueName: \"kubernetes.io/projected/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-kube-api-access-5jbrf\") pod \"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a\" (UID: \"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a\") " Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.603804 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-config-volume\") pod \"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a\" (UID: \"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a\") " Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.604006 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.604037 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-secret-volume\") pod \"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a\" (UID: \"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a\") " Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.604216 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/011b2d6b-88b0-4013-9ded-b9845c02dec0-utilities\") pod \"certified-operators-9n4jp\" (UID: \"011b2d6b-88b0-4013-9ded-b9845c02dec0\") " pod="openshift-marketplace/certified-operators-9n4jp" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.604250 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/011b2d6b-88b0-4013-9ded-b9845c02dec0-catalog-content\") pod \"certified-operators-9n4jp\" (UID: \"011b2d6b-88b0-4013-9ded-b9845c02dec0\") " pod="openshift-marketplace/certified-operators-9n4jp" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.604273 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pvc4\" (UniqueName: \"kubernetes.io/projected/011b2d6b-88b0-4013-9ded-b9845c02dec0-kube-api-access-2pvc4\") pod \"certified-operators-9n4jp\" (UID: \"011b2d6b-88b0-4013-9ded-b9845c02dec0\") " pod="openshift-marketplace/certified-operators-9n4jp" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.609658 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-config-volume" (OuterVolumeSpecName: "config-volume") pod "ea6e9606-19aa-43f7-8344-ebc9f5c3f31a" (UID: "ea6e9606-19aa-43f7-8344-ebc9f5c3f31a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:59:34 crc kubenswrapper[4956]: E0314 08:59:34.609775 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:35.109755366 +0000 UTC m=+180.622447634 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.626954 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-kube-api-access-5jbrf" (OuterVolumeSpecName: "kube-api-access-5jbrf") pod "ea6e9606-19aa-43f7-8344-ebc9f5c3f31a" (UID: "ea6e9606-19aa-43f7-8344-ebc9f5c3f31a"). InnerVolumeSpecName "kube-api-access-5jbrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.632416 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ea6e9606-19aa-43f7-8344-ebc9f5c3f31a" (UID: "ea6e9606-19aa-43f7-8344-ebc9f5c3f31a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.639316 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8xw98" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.674864 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dd7hc"] Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.681455 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dd7hc" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.696791 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dd7hc"] Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.705170 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/011b2d6b-88b0-4013-9ded-b9845c02dec0-utilities\") pod \"certified-operators-9n4jp\" (UID: \"011b2d6b-88b0-4013-9ded-b9845c02dec0\") " pod="openshift-marketplace/certified-operators-9n4jp" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.705346 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.705459 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/011b2d6b-88b0-4013-9ded-b9845c02dec0-catalog-content\") pod \"certified-operators-9n4jp\" (UID: \"011b2d6b-88b0-4013-9ded-b9845c02dec0\") " 
pod="openshift-marketplace/certified-operators-9n4jp" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.705594 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pvc4\" (UniqueName: \"kubernetes.io/projected/011b2d6b-88b0-4013-9ded-b9845c02dec0-kube-api-access-2pvc4\") pod \"certified-operators-9n4jp\" (UID: \"011b2d6b-88b0-4013-9ded-b9845c02dec0\") " pod="openshift-marketplace/certified-operators-9n4jp" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.705706 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jbrf\" (UniqueName: \"kubernetes.io/projected/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-kube-api-access-5jbrf\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.705782 4956 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.705845 4956 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.706546 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/011b2d6b-88b0-4013-9ded-b9845c02dec0-utilities\") pod \"certified-operators-9n4jp\" (UID: \"011b2d6b-88b0-4013-9ded-b9845c02dec0\") " pod="openshift-marketplace/certified-operators-9n4jp" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.708671 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/011b2d6b-88b0-4013-9ded-b9845c02dec0-catalog-content\") pod \"certified-operators-9n4jp\" (UID: \"011b2d6b-88b0-4013-9ded-b9845c02dec0\") " 
pod="openshift-marketplace/certified-operators-9n4jp" Mar 14 08:59:34 crc kubenswrapper[4956]: E0314 08:59:34.708957 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:59:35.208945256 +0000 UTC m=+180.721637524 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9s4fs" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.728305 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pvc4\" (UniqueName: \"kubernetes.io/projected/011b2d6b-88b0-4013-9ded-b9845c02dec0-kube-api-access-2pvc4\") pod \"certified-operators-9n4jp\" (UID: \"011b2d6b-88b0-4013-9ded-b9845c02dec0\") " pod="openshift-marketplace/certified-operators-9n4jp" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.771720 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j2vtm"] Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.809043 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.809256 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038a2b56-42df-4121-b7b4-bdecf2ccb674-catalog-content\") pod \"community-operators-dd7hc\" (UID: \"038a2b56-42df-4121-b7b4-bdecf2ccb674\") " pod="openshift-marketplace/community-operators-dd7hc" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.809280 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038a2b56-42df-4121-b7b4-bdecf2ccb674-utilities\") pod \"community-operators-dd7hc\" (UID: \"038a2b56-42df-4121-b7b4-bdecf2ccb674\") " pod="openshift-marketplace/community-operators-dd7hc" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.809327 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vn45\" (UniqueName: \"kubernetes.io/projected/038a2b56-42df-4121-b7b4-bdecf2ccb674-kube-api-access-2vn45\") pod \"community-operators-dd7hc\" (UID: \"038a2b56-42df-4121-b7b4-bdecf2ccb674\") " pod="openshift-marketplace/community-operators-dd7hc" Mar 14 08:59:34 crc kubenswrapper[4956]: E0314 08:59:34.809446 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:59:35.309430738 +0000 UTC m=+180.822123006 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.860760 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9n4jp" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.897156 4956 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-14T08:59:34.218114012Z","Handler":null,"Name":""} Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.900856 4956 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.900890 4956 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.910420 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038a2b56-42df-4121-b7b4-bdecf2ccb674-catalog-content\") pod \"community-operators-dd7hc\" (UID: \"038a2b56-42df-4121-b7b4-bdecf2ccb674\") " pod="openshift-marketplace/community-operators-dd7hc" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.910923 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/038a2b56-42df-4121-b7b4-bdecf2ccb674-utilities\") pod \"community-operators-dd7hc\" (UID: \"038a2b56-42df-4121-b7b4-bdecf2ccb674\") " pod="openshift-marketplace/community-operators-dd7hc" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.910971 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.910994 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vn45\" (UniqueName: \"kubernetes.io/projected/038a2b56-42df-4121-b7b4-bdecf2ccb674-kube-api-access-2vn45\") pod \"community-operators-dd7hc\" (UID: \"038a2b56-42df-4121-b7b4-bdecf2ccb674\") " pod="openshift-marketplace/community-operators-dd7hc" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.911411 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038a2b56-42df-4121-b7b4-bdecf2ccb674-catalog-content\") pod \"community-operators-dd7hc\" (UID: \"038a2b56-42df-4121-b7b4-bdecf2ccb674\") " pod="openshift-marketplace/community-operators-dd7hc" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.911575 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038a2b56-42df-4121-b7b4-bdecf2ccb674-utilities\") pod \"community-operators-dd7hc\" (UID: \"038a2b56-42df-4121-b7b4-bdecf2ccb674\") " pod="openshift-marketplace/community-operators-dd7hc" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.915697 4956 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME 
capability not set. Skipping MountDevice... Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.915727 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.918143 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d"] Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.938711 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vn45\" (UniqueName: \"kubernetes.io/projected/038a2b56-42df-4121-b7b4-bdecf2ccb674-kube-api-access-2vn45\") pod \"community-operators-dd7hc\" (UID: \"038a2b56-42df-4121-b7b4-bdecf2ccb674\") " pod="openshift-marketplace/community-operators-dd7hc" Mar 14 08:59:34 crc kubenswrapper[4956]: I0314 08:59:34.971654 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9s4fs\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.010127 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dd7hc" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.011856 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.018549 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.021232 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8xw98"] Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.128888 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.134372 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9n4jp"] Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.138031 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.147368 4956 generic.go:334] "Generic (PLEG): container finished" podID="7f2f9b72-16af-43fb-9687-fe7cbbc51bb3" containerID="534dc332cf3213c3378b40a337a3556a617a27764edd854462063130eba499ca" exitCode=0 Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.147464 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2vtm" event={"ID":"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3","Type":"ContainerDied","Data":"534dc332cf3213c3378b40a337a3556a617a27764edd854462063130eba499ca"} Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.147511 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2vtm" event={"ID":"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3","Type":"ContainerStarted","Data":"935c6f59c403bcddfc63f94f3761e3deaa2ef9567721c1164f88c2a1330e17f7"} Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.158306 4956 patch_prober.go:28] interesting pod/router-default-5444994796-hsfss container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 08:59:35 crc kubenswrapper[4956]: [-]has-synced failed: reason withheld Mar 14 08:59:35 crc kubenswrapper[4956]: [+]process-running ok Mar 14 08:59:35 crc kubenswrapper[4956]: healthz check failed Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.158364 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hsfss" podUID="16733c4e-be2e-4b5a-885c-6d2fab583caf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.162574 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" event={"ID":"9c626386-a00a-419c-9116-f2e19d48807a","Type":"ContainerStarted","Data":"76132cc357a5d9fce86b5060532bead51a1bba74fa70811aca7ad83b217436d5"} Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.162637 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" event={"ID":"9c626386-a00a-419c-9116-f2e19d48807a","Type":"ContainerStarted","Data":"76124f4aaa2be70970e12f23315a068e7431877debd5fa55dee0272551a3bf73"} Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.164739 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.169353 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.186954 4956 patch_prober.go:28] interesting pod/route-controller-manager-599fc45645-tf88d container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused" start-of-body= Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.187274 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" podUID="9c626386-a00a-419c-9116-f2e19d48807a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.188813 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" 
event={"ID":"ea6e9606-19aa-43f7-8344-ebc9f5c3f31a","Type":"ContainerDied","Data":"9786cc6ed033e06cdbd092513e846858193131654d5d7dffb77be1819562314d"} Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.188839 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9786cc6ed033e06cdbd092513e846858193131654d5d7dffb77be1819562314d" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.188971 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.204572 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jqf62" event={"ID":"60855808-2138-47e7-9539-df7c86ebe635","Type":"ContainerStarted","Data":"bf8af5f4788d3a61be2fabec2fc40d760dbdae727dae9af9517c401a84710d4f"} Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.229677 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66ba3ebe-86e2-4711-a87d-9505fef09f76" path="/var/lib/kubelet/pods/66ba3ebe-86e2-4711-a87d-9505fef09f76/volumes" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.246809 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.248643 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0d773ac-a40c-471b-a132-9a49a2e315c9" path="/var/lib/kubelet/pods/d0d773ac-a40c-471b-a132-9a49a2e315c9/volumes" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.251035 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xw98" event={"ID":"6a3b7192-2792-4295-b25d-a22c476cd174","Type":"ContainerStarted","Data":"5364f0d2a0b56f993c67694feeea43c64ad9de9b6ae0347dd10211fad9586d54"} Mar 14 08:59:35 
crc kubenswrapper[4956]: I0314 08:59:35.257144 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" podStartSLOduration=3.257120928 podStartE2EDuration="3.257120928s" podCreationTimestamp="2026-03-14 08:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:35.206131769 +0000 UTC m=+180.718824067" watchObservedRunningTime="2026-03-14 08:59:35.257120928 +0000 UTC m=+180.769813216" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.336184 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-jqf62" podStartSLOduration=10.336164667 podStartE2EDuration="10.336164667s" podCreationTimestamp="2026-03-14 08:59:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:35.33185323 +0000 UTC m=+180.844545488" watchObservedRunningTime="2026-03-14 08:59:35.336164667 +0000 UTC m=+180.848856925" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.437365 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9s4fs"] Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.538876 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dd7hc"] Mar 14 08:59:35 crc kubenswrapper[4956]: E0314 08:59:35.839226 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6e9606_19aa_43f7_8344_ebc9f5c3f31a.slice\": RecentStats: unable to find data in memory cache]" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.982353 4956 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager/controller-manager-bd7c99444-gqf6q"] Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.983708 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.987468 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bd7c99444-gqf6q"] Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.991375 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.992571 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.992905 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.992944 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.993086 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.993533 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 08:59:35 crc kubenswrapper[4956]: I0314 08:59:35.996923 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.130789 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-proxy-ca-bundles\") pod \"controller-manager-bd7c99444-gqf6q\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.130950 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-serving-cert\") pod \"controller-manager-bd7c99444-gqf6q\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.130992 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-client-ca\") pod \"controller-manager-bd7c99444-gqf6q\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.131194 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbhzb\" (UniqueName: \"kubernetes.io/projected/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-kube-api-access-hbhzb\") pod \"controller-manager-bd7c99444-gqf6q\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.131244 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-config\") pod \"controller-manager-bd7c99444-gqf6q\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:36 
crc kubenswrapper[4956]: I0314 08:59:36.158887 4956 patch_prober.go:28] interesting pod/router-default-5444994796-hsfss container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 08:59:36 crc kubenswrapper[4956]: [-]has-synced failed: reason withheld Mar 14 08:59:36 crc kubenswrapper[4956]: [+]process-running ok Mar 14 08:59:36 crc kubenswrapper[4956]: healthz check failed Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.158967 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hsfss" podUID="16733c4e-be2e-4b5a-885c-6d2fab583caf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.230718 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" event={"ID":"b74baec9-353b-4ada-a777-a0cedf80aaf8","Type":"ContainerStarted","Data":"f1fd9c795a6416a1686d42521bb46090006d565ac41f646441ab71a6431a508b"} Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.231009 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" event={"ID":"b74baec9-353b-4ada-a777-a0cedf80aaf8","Type":"ContainerStarted","Data":"0dbfc91da8aabb7fe8599a4348c632eebe6811962e6622612fa1e0d7df9323df"} Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.231046 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.232109 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbhzb\" (UniqueName: \"kubernetes.io/projected/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-kube-api-access-hbhzb\") pod \"controller-manager-bd7c99444-gqf6q\" (UID: 
\"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.232147 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-config\") pod \"controller-manager-bd7c99444-gqf6q\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.232179 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-proxy-ca-bundles\") pod \"controller-manager-bd7c99444-gqf6q\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.232198 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-serving-cert\") pod \"controller-manager-bd7c99444-gqf6q\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.232219 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-client-ca\") pod \"controller-manager-bd7c99444-gqf6q\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.233950 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-client-ca\") pod \"controller-manager-bd7c99444-gqf6q\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.234567 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-proxy-ca-bundles\") pod \"controller-manager-bd7c99444-gqf6q\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.236156 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-config\") pod \"controller-manager-bd7c99444-gqf6q\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.236510 4956 generic.go:334] "Generic (PLEG): container finished" podID="011b2d6b-88b0-4013-9ded-b9845c02dec0" containerID="50ce33726c5740459dfa5300daaa5fac0b357405936761ffa945bcb02b2896dd" exitCode=0 Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.236594 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n4jp" event={"ID":"011b2d6b-88b0-4013-9ded-b9845c02dec0","Type":"ContainerDied","Data":"50ce33726c5740459dfa5300daaa5fac0b357405936761ffa945bcb02b2896dd"} Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.236638 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n4jp" event={"ID":"011b2d6b-88b0-4013-9ded-b9845c02dec0","Type":"ContainerStarted","Data":"fff431752c2b9f428459ec069526de84d5c47a807741fcf356b08764213e0ca2"} Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 
08:59:36.240499 4956 generic.go:334] "Generic (PLEG): container finished" podID="038a2b56-42df-4121-b7b4-bdecf2ccb674" containerID="e85151e03dcebf0e5ca1b063d705ab77ff8d85ae599eb02c834176041690eb6e" exitCode=0 Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.240938 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7hc" event={"ID":"038a2b56-42df-4121-b7b4-bdecf2ccb674","Type":"ContainerDied","Data":"e85151e03dcebf0e5ca1b063d705ab77ff8d85ae599eb02c834176041690eb6e"} Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.241254 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7hc" event={"ID":"038a2b56-42df-4121-b7b4-bdecf2ccb674","Type":"ContainerStarted","Data":"72851f00ba1dca78d783c02d7f56dfe09e5b4f4ec000ff48810eef4659bb5bec"} Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.242188 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-serving-cert\") pod \"controller-manager-bd7c99444-gqf6q\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.246550 4956 generic.go:334] "Generic (PLEG): container finished" podID="6a3b7192-2792-4295-b25d-a22c476cd174" containerID="49dc3da3d6778d72cad65f189f9bd13ac251fc0f5f341eab57b33345fe2afd93" exitCode=0 Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.247808 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xw98" event={"ID":"6a3b7192-2792-4295-b25d-a22c476cd174","Type":"ContainerDied","Data":"49dc3da3d6778d72cad65f189f9bd13ac251fc0f5f341eab57b33345fe2afd93"} Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.254204 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.255906 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" podStartSLOduration=108.255865773 podStartE2EDuration="1m48.255865773s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:36.25456843 +0000 UTC m=+181.767260708" watchObservedRunningTime="2026-03-14 08:59:36.255865773 +0000 UTC m=+181.768558041" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.274376 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f4ztd"] Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.275395 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4ztd" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.278538 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.286336 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbhzb\" (UniqueName: \"kubernetes.io/projected/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-kube-api-access-hbhzb\") pod \"controller-manager-bd7c99444-gqf6q\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.305209 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.312300 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4ztd"] Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.334284 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b22cf8-2118-463f-804e-9890feee4427-utilities\") pod \"redhat-marketplace-f4ztd\" (UID: \"78b22cf8-2118-463f-804e-9890feee4427\") " pod="openshift-marketplace/redhat-marketplace-f4ztd" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.334374 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7rkl\" (UniqueName: \"kubernetes.io/projected/78b22cf8-2118-463f-804e-9890feee4427-kube-api-access-r7rkl\") pod \"redhat-marketplace-f4ztd\" (UID: \"78b22cf8-2118-463f-804e-9890feee4427\") " pod="openshift-marketplace/redhat-marketplace-f4ztd" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.334408 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b22cf8-2118-463f-804e-9890feee4427-catalog-content\") pod \"redhat-marketplace-f4ztd\" (UID: \"78b22cf8-2118-463f-804e-9890feee4427\") " pod="openshift-marketplace/redhat-marketplace-f4ztd" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.427865 4956 ???:1] "http: TLS handshake error from 192.168.126.11:55852: no serving certificate available for the kubelet" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.436534 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b22cf8-2118-463f-804e-9890feee4427-utilities\") pod \"redhat-marketplace-f4ztd\" (UID: 
\"78b22cf8-2118-463f-804e-9890feee4427\") " pod="openshift-marketplace/redhat-marketplace-f4ztd" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.436611 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7rkl\" (UniqueName: \"kubernetes.io/projected/78b22cf8-2118-463f-804e-9890feee4427-kube-api-access-r7rkl\") pod \"redhat-marketplace-f4ztd\" (UID: \"78b22cf8-2118-463f-804e-9890feee4427\") " pod="openshift-marketplace/redhat-marketplace-f4ztd" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.436649 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b22cf8-2118-463f-804e-9890feee4427-catalog-content\") pod \"redhat-marketplace-f4ztd\" (UID: \"78b22cf8-2118-463f-804e-9890feee4427\") " pod="openshift-marketplace/redhat-marketplace-f4ztd" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.438384 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b22cf8-2118-463f-804e-9890feee4427-catalog-content\") pod \"redhat-marketplace-f4ztd\" (UID: \"78b22cf8-2118-463f-804e-9890feee4427\") " pod="openshift-marketplace/redhat-marketplace-f4ztd" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.438901 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b22cf8-2118-463f-804e-9890feee4427-utilities\") pod \"redhat-marketplace-f4ztd\" (UID: \"78b22cf8-2118-463f-804e-9890feee4427\") " pod="openshift-marketplace/redhat-marketplace-f4ztd" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.464285 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7rkl\" (UniqueName: \"kubernetes.io/projected/78b22cf8-2118-463f-804e-9890feee4427-kube-api-access-r7rkl\") pod \"redhat-marketplace-f4ztd\" (UID: 
\"78b22cf8-2118-463f-804e-9890feee4427\") " pod="openshift-marketplace/redhat-marketplace-f4ztd" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.582100 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bd7c99444-gqf6q"] Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.610137 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4ztd" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.687108 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kh627"] Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.688286 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kh627" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.696159 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kh627"] Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.843055 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39583bbe-6bbd-4423-b048-61c0dc5d955e-catalog-content\") pod \"redhat-marketplace-kh627\" (UID: \"39583bbe-6bbd-4423-b048-61c0dc5d955e\") " pod="openshift-marketplace/redhat-marketplace-kh627" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.843779 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39583bbe-6bbd-4423-b048-61c0dc5d955e-utilities\") pod \"redhat-marketplace-kh627\" (UID: \"39583bbe-6bbd-4423-b048-61c0dc5d955e\") " pod="openshift-marketplace/redhat-marketplace-kh627" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.843804 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-m2lpp\" (UniqueName: \"kubernetes.io/projected/39583bbe-6bbd-4423-b048-61c0dc5d955e-kube-api-access-m2lpp\") pod \"redhat-marketplace-kh627\" (UID: \"39583bbe-6bbd-4423-b048-61c0dc5d955e\") " pod="openshift-marketplace/redhat-marketplace-kh627" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.900086 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4ztd"] Mar 14 08:59:36 crc kubenswrapper[4956]: W0314 08:59:36.911981 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78b22cf8_2118_463f_804e_9890feee4427.slice/crio-a512f833a9ece83238ad251f8b12089f58b800a74b9d08f510d3c3d8c6a1d4c5 WatchSource:0}: Error finding container a512f833a9ece83238ad251f8b12089f58b800a74b9d08f510d3c3d8c6a1d4c5: Status 404 returned error can't find the container with id a512f833a9ece83238ad251f8b12089f58b800a74b9d08f510d3c3d8c6a1d4c5 Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.945219 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39583bbe-6bbd-4423-b048-61c0dc5d955e-catalog-content\") pod \"redhat-marketplace-kh627\" (UID: \"39583bbe-6bbd-4423-b048-61c0dc5d955e\") " pod="openshift-marketplace/redhat-marketplace-kh627" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.945345 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39583bbe-6bbd-4423-b048-61c0dc5d955e-utilities\") pod \"redhat-marketplace-kh627\" (UID: \"39583bbe-6bbd-4423-b048-61c0dc5d955e\") " pod="openshift-marketplace/redhat-marketplace-kh627" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.945381 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2lpp\" (UniqueName: 
\"kubernetes.io/projected/39583bbe-6bbd-4423-b048-61c0dc5d955e-kube-api-access-m2lpp\") pod \"redhat-marketplace-kh627\" (UID: \"39583bbe-6bbd-4423-b048-61c0dc5d955e\") " pod="openshift-marketplace/redhat-marketplace-kh627" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.946773 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39583bbe-6bbd-4423-b048-61c0dc5d955e-catalog-content\") pod \"redhat-marketplace-kh627\" (UID: \"39583bbe-6bbd-4423-b048-61c0dc5d955e\") " pod="openshift-marketplace/redhat-marketplace-kh627" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.947299 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39583bbe-6bbd-4423-b048-61c0dc5d955e-utilities\") pod \"redhat-marketplace-kh627\" (UID: \"39583bbe-6bbd-4423-b048-61c0dc5d955e\") " pod="openshift-marketplace/redhat-marketplace-kh627" Mar 14 08:59:36 crc kubenswrapper[4956]: I0314 08:59:36.967421 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2lpp\" (UniqueName: \"kubernetes.io/projected/39583bbe-6bbd-4423-b048-61c0dc5d955e-kube-api-access-m2lpp\") pod \"redhat-marketplace-kh627\" (UID: \"39583bbe-6bbd-4423-b048-61c0dc5d955e\") " pod="openshift-marketplace/redhat-marketplace-kh627" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.020855 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kh627" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.158623 4956 patch_prober.go:28] interesting pod/router-default-5444994796-hsfss container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 08:59:37 crc kubenswrapper[4956]: [-]has-synced failed: reason withheld Mar 14 08:59:37 crc kubenswrapper[4956]: [+]process-running ok Mar 14 08:59:37 crc kubenswrapper[4956]: healthz check failed Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.159019 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hsfss" podUID="16733c4e-be2e-4b5a-885c-6d2fab583caf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.278715 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n2x6s"] Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.280523 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n2x6s" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.286581 4956 generic.go:334] "Generic (PLEG): container finished" podID="78b22cf8-2118-463f-804e-9890feee4427" containerID="f5cff739814123d5bbda760d0816ca4510d4a8554ce2d646eec4c24af9f8a933" exitCode=0 Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.286709 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4ztd" event={"ID":"78b22cf8-2118-463f-804e-9890feee4427","Type":"ContainerDied","Data":"f5cff739814123d5bbda760d0816ca4510d4a8554ce2d646eec4c24af9f8a933"} Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.286744 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4ztd" event={"ID":"78b22cf8-2118-463f-804e-9890feee4427","Type":"ContainerStarted","Data":"a512f833a9ece83238ad251f8b12089f58b800a74b9d08f510d3c3d8c6a1d4c5"} Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.288018 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.297389 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" event={"ID":"782a28e5-7216-4f1c-aa1e-71a6a7d5044f","Type":"ContainerStarted","Data":"022bb866482c4093b8b4d6bf06868fd669f4930c06d372c66b187689e8273215"} Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.297446 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" event={"ID":"782a28e5-7216-4f1c-aa1e-71a6a7d5044f","Type":"ContainerStarted","Data":"5218ebc666709092f29e59ee31af739221572632dbd3b33960bd18413ed87fc2"} Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.298464 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.312055 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n2x6s"] Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.329254 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.353121 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2667e495-4a15-4aa2-8839-e1b66f2ee380-catalog-content\") pod \"redhat-operators-n2x6s\" (UID: \"2667e495-4a15-4aa2-8839-e1b66f2ee380\") " pod="openshift-marketplace/redhat-operators-n2x6s" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.353238 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2667e495-4a15-4aa2-8839-e1b66f2ee380-utilities\") pod \"redhat-operators-n2x6s\" (UID: \"2667e495-4a15-4aa2-8839-e1b66f2ee380\") " pod="openshift-marketplace/redhat-operators-n2x6s" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.353356 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xsff\" (UniqueName: \"kubernetes.io/projected/2667e495-4a15-4aa2-8839-e1b66f2ee380-kube-api-access-2xsff\") pod \"redhat-operators-n2x6s\" (UID: \"2667e495-4a15-4aa2-8839-e1b66f2ee380\") " pod="openshift-marketplace/redhat-operators-n2x6s" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.354626 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kh627"] Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.385440 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" podStartSLOduration=5.385411445 podStartE2EDuration="5.385411445s" podCreationTimestamp="2026-03-14 08:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:37.375768305 +0000 UTC m=+182.888460573" watchObservedRunningTime="2026-03-14 08:59:37.385411445 +0000 UTC m=+182.898103713" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.454357 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2667e495-4a15-4aa2-8839-e1b66f2ee380-utilities\") pod \"redhat-operators-n2x6s\" (UID: \"2667e495-4a15-4aa2-8839-e1b66f2ee380\") " pod="openshift-marketplace/redhat-operators-n2x6s" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.454590 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xsff\" (UniqueName: \"kubernetes.io/projected/2667e495-4a15-4aa2-8839-e1b66f2ee380-kube-api-access-2xsff\") pod \"redhat-operators-n2x6s\" (UID: \"2667e495-4a15-4aa2-8839-e1b66f2ee380\") " pod="openshift-marketplace/redhat-operators-n2x6s" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.454681 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2667e495-4a15-4aa2-8839-e1b66f2ee380-catalog-content\") pod \"redhat-operators-n2x6s\" (UID: \"2667e495-4a15-4aa2-8839-e1b66f2ee380\") " pod="openshift-marketplace/redhat-operators-n2x6s" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.455109 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2667e495-4a15-4aa2-8839-e1b66f2ee380-catalog-content\") pod \"redhat-operators-n2x6s\" (UID: \"2667e495-4a15-4aa2-8839-e1b66f2ee380\") " 
pod="openshift-marketplace/redhat-operators-n2x6s" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.456980 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2667e495-4a15-4aa2-8839-e1b66f2ee380-utilities\") pod \"redhat-operators-n2x6s\" (UID: \"2667e495-4a15-4aa2-8839-e1b66f2ee380\") " pod="openshift-marketplace/redhat-operators-n2x6s" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.460777 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.476851 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.476903 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.501621 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.502722 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.534561 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xsff\" (UniqueName: \"kubernetes.io/projected/2667e495-4a15-4aa2-8839-e1b66f2ee380-kube-api-access-2xsff\") pod \"redhat-operators-n2x6s\" (UID: \"2667e495-4a15-4aa2-8839-e1b66f2ee380\") " pod="openshift-marketplace/redhat-operators-n2x6s" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.561257 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.561324 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.619991 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n2x6s" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.666394 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.666497 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.666961 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.697149 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5glts"] Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.697972 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.698390 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5glts" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.710464 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.711286 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.719264 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.719589 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.734320 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.753605 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5glts"] Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.771415 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2brbh\" (UniqueName: \"kubernetes.io/projected/a671bb4b-c176-4930-8b09-c5f1b03e27c9-kube-api-access-2brbh\") pod \"redhat-operators-5glts\" (UID: \"a671bb4b-c176-4930-8b09-c5f1b03e27c9\") " pod="openshift-marketplace/redhat-operators-5glts" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.771519 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fc199c10-e8a5-409a-9ca5-b99a872682c2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fc199c10-e8a5-409a-9ca5-b99a872682c2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.771563 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc199c10-e8a5-409a-9ca5-b99a872682c2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fc199c10-e8a5-409a-9ca5-b99a872682c2\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.771617 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a671bb4b-c176-4930-8b09-c5f1b03e27c9-catalog-content\") pod \"redhat-operators-5glts\" (UID: \"a671bb4b-c176-4930-8b09-c5f1b03e27c9\") " pod="openshift-marketplace/redhat-operators-5glts" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.771635 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a671bb4b-c176-4930-8b09-c5f1b03e27c9-utilities\") pod \"redhat-operators-5glts\" (UID: \"a671bb4b-c176-4930-8b09-c5f1b03e27c9\") " pod="openshift-marketplace/redhat-operators-5glts" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.870036 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.875754 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a671bb4b-c176-4930-8b09-c5f1b03e27c9-catalog-content\") pod \"redhat-operators-5glts\" (UID: \"a671bb4b-c176-4930-8b09-c5f1b03e27c9\") " pod="openshift-marketplace/redhat-operators-5glts" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.876043 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a671bb4b-c176-4930-8b09-c5f1b03e27c9-utilities\") pod \"redhat-operators-5glts\" (UID: \"a671bb4b-c176-4930-8b09-c5f1b03e27c9\") " pod="openshift-marketplace/redhat-operators-5glts" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.877232 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2brbh\" (UniqueName: \"kubernetes.io/projected/a671bb4b-c176-4930-8b09-c5f1b03e27c9-kube-api-access-2brbh\") pod \"redhat-operators-5glts\" (UID: \"a671bb4b-c176-4930-8b09-c5f1b03e27c9\") " pod="openshift-marketplace/redhat-operators-5glts" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.877298 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fc199c10-e8a5-409a-9ca5-b99a872682c2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fc199c10-e8a5-409a-9ca5-b99a872682c2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.877546 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc199c10-e8a5-409a-9ca5-b99a872682c2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fc199c10-e8a5-409a-9ca5-b99a872682c2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.878541 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a671bb4b-c176-4930-8b09-c5f1b03e27c9-utilities\") pod \"redhat-operators-5glts\" (UID: \"a671bb4b-c176-4930-8b09-c5f1b03e27c9\") " pod="openshift-marketplace/redhat-operators-5glts" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.878611 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fc199c10-e8a5-409a-9ca5-b99a872682c2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fc199c10-e8a5-409a-9ca5-b99a872682c2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.879546 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a671bb4b-c176-4930-8b09-c5f1b03e27c9-catalog-content\") pod \"redhat-operators-5glts\" (UID: \"a671bb4b-c176-4930-8b09-c5f1b03e27c9\") " pod="openshift-marketplace/redhat-operators-5glts" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.911313 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc199c10-e8a5-409a-9ca5-b99a872682c2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fc199c10-e8a5-409a-9ca5-b99a872682c2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:59:37 crc kubenswrapper[4956]: I0314 08:59:37.911332 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2brbh\" (UniqueName: \"kubernetes.io/projected/a671bb4b-c176-4930-8b09-c5f1b03e27c9-kube-api-access-2brbh\") pod \"redhat-operators-5glts\" (UID: \"a671bb4b-c176-4930-8b09-c5f1b03e27c9\") " pod="openshift-marketplace/redhat-operators-5glts" Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.038520 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5glts" Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.063716 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.157728 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.158936 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.168599 4956 patch_prober.go:28] interesting pod/downloads-7954f5f757-lpkrv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.168659 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-lpkrv" podUID="343e0673-aad7-49c1-91b7-f5fd88579db3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.178693 4956 patch_prober.go:28] interesting pod/downloads-7954f5f757-lpkrv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.178784 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lpkrv" podUID="343e0673-aad7-49c1-91b7-f5fd88579db3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.179885 4956 patch_prober.go:28] interesting pod/console-f9d7485db-8qng2 container/console 
namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.179919 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8qng2" podUID="d7ce6871-ad8a-4ad9-b804-f3931e0f60ae" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.189202 4956 patch_prober.go:28] interesting pod/router-default-5444994796-hsfss container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 08:59:38 crc kubenswrapper[4956]: [-]has-synced failed: reason withheld Mar 14 08:59:38 crc kubenswrapper[4956]: [+]process-running ok Mar 14 08:59:38 crc kubenswrapper[4956]: healthz check failed Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.189259 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hsfss" podUID="16733c4e-be2e-4b5a-885c-6d2fab583caf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.244386 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.256514 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-vcs4p" Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.311902 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n2x6s"] Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.312029 4956 
generic.go:334] "Generic (PLEG): container finished" podID="39583bbe-6bbd-4423-b048-61c0dc5d955e" containerID="d2fcd66ad8704ecee304f510900f7cf3edf84e3fe5dfbf51ea23980e82f793b2" exitCode=0 Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.313391 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kh627" event={"ID":"39583bbe-6bbd-4423-b048-61c0dc5d955e","Type":"ContainerDied","Data":"d2fcd66ad8704ecee304f510900f7cf3edf84e3fe5dfbf51ea23980e82f793b2"} Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.313417 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kh627" event={"ID":"39583bbe-6bbd-4423-b048-61c0dc5d955e","Type":"ContainerStarted","Data":"38451df00baac777b2a92f8e4e3ebf38e4ae42eb09109bb98aa92acf93f169f4"} Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.495149 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 14 08:59:38 crc kubenswrapper[4956]: W0314 08:59:38.527454 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc5f43a71_8ea3_46f5_a6ce_e1ffaff74a79.slice/crio-8205ed755fe04e64598f0ad23465b67a5244494a530b93955dd59cbf0f50c976 WatchSource:0}: Error finding container 8205ed755fe04e64598f0ad23465b67a5244494a530b93955dd59cbf0f50c976: Status 404 returned error can't find the container with id 8205ed755fe04e64598f0ad23465b67a5244494a530b93955dd59cbf0f50c976 Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.592686 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5glts"] Mar 14 08:59:38 crc kubenswrapper[4956]: I0314 08:59:38.763801 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 14 08:59:39 crc kubenswrapper[4956]: I0314 08:59:39.155116 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:39 crc kubenswrapper[4956]: I0314 08:59:39.158997 4956 patch_prober.go:28] interesting pod/router-default-5444994796-hsfss container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 08:59:39 crc kubenswrapper[4956]: [-]has-synced failed: reason withheld Mar 14 08:59:39 crc kubenswrapper[4956]: [+]process-running ok Mar 14 08:59:39 crc kubenswrapper[4956]: healthz check failed Mar 14 08:59:39 crc kubenswrapper[4956]: I0314 08:59:39.159030 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hsfss" podUID="16733c4e-be2e-4b5a-885c-6d2fab583caf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:59:39 crc kubenswrapper[4956]: I0314 08:59:39.341224 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fc199c10-e8a5-409a-9ca5-b99a872682c2","Type":"ContainerStarted","Data":"4aaab3b3a4540a6eb5effdcded76bb7656dbe93f928f5018fe0f9674825ab859"} Mar 14 08:59:39 crc kubenswrapper[4956]: I0314 08:59:39.359819 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79","Type":"ContainerStarted","Data":"8205ed755fe04e64598f0ad23465b67a5244494a530b93955dd59cbf0f50c976"} Mar 14 08:59:39 crc kubenswrapper[4956]: I0314 08:59:39.372046 4956 generic.go:334] "Generic (PLEG): container finished" podID="2667e495-4a15-4aa2-8839-e1b66f2ee380" containerID="674ae2c1339c7b26e1e8d9d05f922112191c27a8b7533e53848f89b2e2196c94" exitCode=0 Mar 14 08:59:39 crc kubenswrapper[4956]: I0314 08:59:39.372270 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2x6s" 
event={"ID":"2667e495-4a15-4aa2-8839-e1b66f2ee380","Type":"ContainerDied","Data":"674ae2c1339c7b26e1e8d9d05f922112191c27a8b7533e53848f89b2e2196c94"} Mar 14 08:59:39 crc kubenswrapper[4956]: I0314 08:59:39.372317 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2x6s" event={"ID":"2667e495-4a15-4aa2-8839-e1b66f2ee380","Type":"ContainerStarted","Data":"0a4691427777bb8e20835891c990ab11cc0dd4b2e6d16be845235f4c0c0ff692"} Mar 14 08:59:39 crc kubenswrapper[4956]: I0314 08:59:39.394638 4956 generic.go:334] "Generic (PLEG): container finished" podID="a671bb4b-c176-4930-8b09-c5f1b03e27c9" containerID="b2738f92d254f7a9d241d8c2797ffec20057b8f31e7396407d7f3372c0dbf6e7" exitCode=0 Mar 14 08:59:39 crc kubenswrapper[4956]: I0314 08:59:39.395398 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5glts" event={"ID":"a671bb4b-c176-4930-8b09-c5f1b03e27c9","Type":"ContainerDied","Data":"b2738f92d254f7a9d241d8c2797ffec20057b8f31e7396407d7f3372c0dbf6e7"} Mar 14 08:59:39 crc kubenswrapper[4956]: I0314 08:59:39.395517 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5glts" event={"ID":"a671bb4b-c176-4930-8b09-c5f1b03e27c9","Type":"ContainerStarted","Data":"7830e7fe174afb0b88753dd348c3242046e4cbfa7c3edb93e277a4e0bda530be"} Mar 14 08:59:40 crc kubenswrapper[4956]: I0314 08:59:40.162994 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:40 crc kubenswrapper[4956]: I0314 08:59:40.165640 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-hsfss" Mar 14 08:59:40 crc kubenswrapper[4956]: I0314 08:59:40.422730 4956 generic.go:334] "Generic (PLEG): container finished" podID="fc199c10-e8a5-409a-9ca5-b99a872682c2" containerID="9941943d397fbe20071b491662f2bfda57aa2a92506217f4f2fd5d83a08ca990" 
exitCode=0 Mar 14 08:59:40 crc kubenswrapper[4956]: I0314 08:59:40.423243 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fc199c10-e8a5-409a-9ca5-b99a872682c2","Type":"ContainerDied","Data":"9941943d397fbe20071b491662f2bfda57aa2a92506217f4f2fd5d83a08ca990"} Mar 14 08:59:40 crc kubenswrapper[4956]: I0314 08:59:40.427698 4956 generic.go:334] "Generic (PLEG): container finished" podID="c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79" containerID="04da7177d6ff6de01cc291ec9a0d06c19c6255213504b468d71be3248f476a01" exitCode=0 Mar 14 08:59:40 crc kubenswrapper[4956]: I0314 08:59:40.427753 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79","Type":"ContainerDied","Data":"04da7177d6ff6de01cc291ec9a0d06c19c6255213504b468d71be3248f476a01"} Mar 14 08:59:41 crc kubenswrapper[4956]: I0314 08:59:41.053338 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-m5sxz" Mar 14 08:59:41 crc kubenswrapper[4956]: I0314 08:59:41.572294 4956 ???:1] "http: TLS handshake error from 192.168.126.11:42868: no serving certificate available for the kubelet" Mar 14 08:59:41 crc kubenswrapper[4956]: I0314 08:59:41.737354 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:59:41 crc kubenswrapper[4956]: I0314 08:59:41.886065 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc199c10-e8a5-409a-9ca5-b99a872682c2-kube-api-access\") pod \"fc199c10-e8a5-409a-9ca5-b99a872682c2\" (UID: \"fc199c10-e8a5-409a-9ca5-b99a872682c2\") " Mar 14 08:59:41 crc kubenswrapper[4956]: I0314 08:59:41.886145 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fc199c10-e8a5-409a-9ca5-b99a872682c2-kubelet-dir\") pod \"fc199c10-e8a5-409a-9ca5-b99a872682c2\" (UID: \"fc199c10-e8a5-409a-9ca5-b99a872682c2\") " Mar 14 08:59:41 crc kubenswrapper[4956]: I0314 08:59:41.886251 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc199c10-e8a5-409a-9ca5-b99a872682c2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fc199c10-e8a5-409a-9ca5-b99a872682c2" (UID: "fc199c10-e8a5-409a-9ca5-b99a872682c2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:59:41 crc kubenswrapper[4956]: I0314 08:59:41.886521 4956 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fc199c10-e8a5-409a-9ca5-b99a872682c2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:41 crc kubenswrapper[4956]: I0314 08:59:41.890434 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:59:41 crc kubenswrapper[4956]: I0314 08:59:41.891459 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc199c10-e8a5-409a-9ca5-b99a872682c2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fc199c10-e8a5-409a-9ca5-b99a872682c2" (UID: "fc199c10-e8a5-409a-9ca5-b99a872682c2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:59:41 crc kubenswrapper[4956]: I0314 08:59:41.987681 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc199c10-e8a5-409a-9ca5-b99a872682c2-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:42 crc kubenswrapper[4956]: I0314 08:59:42.088777 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79-kubelet-dir\") pod \"c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79\" (UID: \"c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79\") " Mar 14 08:59:42 crc kubenswrapper[4956]: I0314 08:59:42.088940 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79-kube-api-access\") pod \"c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79\" (UID: \"c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79\") " Mar 14 08:59:42 crc kubenswrapper[4956]: I0314 08:59:42.089198 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79" (UID: "c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:59:42 crc kubenswrapper[4956]: I0314 08:59:42.089416 4956 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:42 crc kubenswrapper[4956]: I0314 08:59:42.093435 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79" (UID: "c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:59:42 crc kubenswrapper[4956]: I0314 08:59:42.190173 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:42 crc kubenswrapper[4956]: I0314 08:59:42.392883 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs\") pod \"network-metrics-daemon-42pn5\" (UID: \"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\") " pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:42 crc kubenswrapper[4956]: I0314 08:59:42.395128 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 14 08:59:42 crc kubenswrapper[4956]: I0314 08:59:42.406611 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf6ad235-d99c-46a7-8c2d-6fc12fc07c10-metrics-certs\") pod \"network-metrics-daemon-42pn5\" (UID: \"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10\") " pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:42 crc 
kubenswrapper[4956]: I0314 08:59:42.451180 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:59:42 crc kubenswrapper[4956]: I0314 08:59:42.451180 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79","Type":"ContainerDied","Data":"8205ed755fe04e64598f0ad23465b67a5244494a530b93955dd59cbf0f50c976"} Mar 14 08:59:42 crc kubenswrapper[4956]: I0314 08:59:42.451288 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8205ed755fe04e64598f0ad23465b67a5244494a530b93955dd59cbf0f50c976" Mar 14 08:59:42 crc kubenswrapper[4956]: I0314 08:59:42.463208 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fc199c10-e8a5-409a-9ca5-b99a872682c2","Type":"ContainerDied","Data":"4aaab3b3a4540a6eb5effdcded76bb7656dbe93f928f5018fe0f9674825ab859"} Mar 14 08:59:42 crc kubenswrapper[4956]: I0314 08:59:42.463246 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4aaab3b3a4540a6eb5effdcded76bb7656dbe93f928f5018fe0f9674825ab859" Mar 14 08:59:42 crc kubenswrapper[4956]: I0314 08:59:42.463271 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:59:42 crc kubenswrapper[4956]: I0314 08:59:42.526323 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 14 08:59:42 crc kubenswrapper[4956]: I0314 08:59:42.535550 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-42pn5" Mar 14 08:59:45 crc kubenswrapper[4956]: I0314 08:59:45.275827 4956 ???:1] "http: TLS handshake error from 192.168.126.11:42872: no serving certificate available for the kubelet" Mar 14 08:59:45 crc kubenswrapper[4956]: E0314 08:59:45.994423 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6e9606_19aa_43f7_8344_ebc9f5c3f31a.slice\": RecentStats: unable to find data in memory cache]" Mar 14 08:59:48 crc kubenswrapper[4956]: I0314 08:59:48.157672 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:48 crc kubenswrapper[4956]: I0314 08:59:48.161996 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-8qng2" Mar 14 08:59:48 crc kubenswrapper[4956]: I0314 08:59:48.165601 4956 patch_prober.go:28] interesting pod/downloads-7954f5f757-lpkrv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 14 08:59:48 crc kubenswrapper[4956]: I0314 08:59:48.165620 4956 patch_prober.go:28] interesting pod/downloads-7954f5f757-lpkrv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 14 08:59:48 crc kubenswrapper[4956]: I0314 08:59:48.165659 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-lpkrv" podUID="343e0673-aad7-49c1-91b7-f5fd88579db3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 14 
08:59:48 crc kubenswrapper[4956]: I0314 08:59:48.165666 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lpkrv" podUID="343e0673-aad7-49c1-91b7-f5fd88579db3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 14 08:59:51 crc kubenswrapper[4956]: I0314 08:59:51.098282 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bd7c99444-gqf6q"] Mar 14 08:59:51 crc kubenswrapper[4956]: I0314 08:59:51.098976 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" podUID="782a28e5-7216-4f1c-aa1e-71a6a7d5044f" containerName="controller-manager" containerID="cri-o://022bb866482c4093b8b4d6bf06868fd669f4930c06d372c66b187689e8273215" gracePeriod=30 Mar 14 08:59:51 crc kubenswrapper[4956]: I0314 08:59:51.132990 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d"] Mar 14 08:59:51 crc kubenswrapper[4956]: I0314 08:59:51.133237 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" podUID="9c626386-a00a-419c-9116-f2e19d48807a" containerName="route-controller-manager" containerID="cri-o://76132cc357a5d9fce86b5060532bead51a1bba74fa70811aca7ad83b217436d5" gracePeriod=30 Mar 14 08:59:51 crc kubenswrapper[4956]: I0314 08:59:51.522572 4956 generic.go:334] "Generic (PLEG): container finished" podID="9c626386-a00a-419c-9116-f2e19d48807a" containerID="76132cc357a5d9fce86b5060532bead51a1bba74fa70811aca7ad83b217436d5" exitCode=0 Mar 14 08:59:51 crc kubenswrapper[4956]: I0314 08:59:51.522658 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" event={"ID":"9c626386-a00a-419c-9116-f2e19d48807a","Type":"ContainerDied","Data":"76132cc357a5d9fce86b5060532bead51a1bba74fa70811aca7ad83b217436d5"} Mar 14 08:59:51 crc kubenswrapper[4956]: I0314 08:59:51.524879 4956 generic.go:334] "Generic (PLEG): container finished" podID="782a28e5-7216-4f1c-aa1e-71a6a7d5044f" containerID="022bb866482c4093b8b4d6bf06868fd669f4930c06d372c66b187689e8273215" exitCode=0 Mar 14 08:59:51 crc kubenswrapper[4956]: I0314 08:59:51.524922 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" event={"ID":"782a28e5-7216-4f1c-aa1e-71a6a7d5044f","Type":"ContainerDied","Data":"022bb866482c4093b8b4d6bf06868fd669f4930c06d372c66b187689e8273215"} Mar 14 08:59:51 crc kubenswrapper[4956]: I0314 08:59:51.836000 4956 ???:1] "http: TLS handshake error from 192.168.126.11:34258: no serving certificate available for the kubelet" Mar 14 08:59:54 crc kubenswrapper[4956]: I0314 08:59:54.346152 4956 patch_prober.go:28] interesting pod/route-controller-manager-599fc45645-tf88d container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused" start-of-body= Mar 14 08:59:54 crc kubenswrapper[4956]: I0314 08:59:54.346578 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" podUID="9c626386-a00a-419c-9116-f2e19d48807a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused" Mar 14 08:59:55 crc kubenswrapper[4956]: I0314 08:59:55.144707 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 08:59:56 crc kubenswrapper[4956]: E0314 08:59:56.129655 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6e9606_19aa_43f7_8344_ebc9f5c3f31a.slice\": RecentStats: unable to find data in memory cache]" Mar 14 08:59:57 crc kubenswrapper[4956]: I0314 08:59:57.305356 4956 patch_prober.go:28] interesting pod/controller-manager-bd7c99444-gqf6q container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: i/o timeout (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 08:59:57 crc kubenswrapper[4956]: I0314 08:59:57.305675 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" podUID="782a28e5-7216-4f1c-aa1e-71a6a7d5044f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: i/o timeout (Client.Timeout exceeded while awaiting headers)" Mar 14 08:59:58 crc kubenswrapper[4956]: I0314 08:59:58.171898 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-lpkrv" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.130503 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557980-v2mp9"] Mar 14 09:00:00 crc kubenswrapper[4956]: E0314 09:00:00.130716 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc199c10-e8a5-409a-9ca5-b99a872682c2" containerName="pruner" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.130729 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc199c10-e8a5-409a-9ca5-b99a872682c2" containerName="pruner" Mar 14 09:00:00 crc 
kubenswrapper[4956]: E0314 09:00:00.130740 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79" containerName="pruner" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.130746 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79" containerName="pruner" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.130827 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc199c10-e8a5-409a-9ca5-b99a872682c2" containerName="pruner" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.130835 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f43a71-8ea3-46f5-a6ce-e1ffaff74a79" containerName="pruner" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.133325 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557980-v2mp9" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.138074 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.138617 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.138917 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.142650 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557980-v2mp9"] Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.160208 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rhpq\" (UniqueName: \"kubernetes.io/projected/f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c-kube-api-access-6rhpq\") pod \"auto-csr-approver-29557980-v2mp9\" (UID: 
\"f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c\") " pod="openshift-infra/auto-csr-approver-29557980-v2mp9" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.232585 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp"] Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.233375 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.235875 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.236006 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.245897 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp"] Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.261621 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6pzl\" (UniqueName: \"kubernetes.io/projected/35335b0c-87c6-40c0-9362-0252727eebee-kube-api-access-l6pzl\") pod \"collect-profiles-29557980-rlrhp\" (UID: \"35335b0c-87c6-40c0-9362-0252727eebee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.261695 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35335b0c-87c6-40c0-9362-0252727eebee-secret-volume\") pod \"collect-profiles-29557980-rlrhp\" (UID: \"35335b0c-87c6-40c0-9362-0252727eebee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp" Mar 14 
09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.261725 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35335b0c-87c6-40c0-9362-0252727eebee-config-volume\") pod \"collect-profiles-29557980-rlrhp\" (UID: \"35335b0c-87c6-40c0-9362-0252727eebee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.261800 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rhpq\" (UniqueName: \"kubernetes.io/projected/f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c-kube-api-access-6rhpq\") pod \"auto-csr-approver-29557980-v2mp9\" (UID: \"f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c\") " pod="openshift-infra/auto-csr-approver-29557980-v2mp9" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.279034 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rhpq\" (UniqueName: \"kubernetes.io/projected/f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c-kube-api-access-6rhpq\") pod \"auto-csr-approver-29557980-v2mp9\" (UID: \"f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c\") " pod="openshift-infra/auto-csr-approver-29557980-v2mp9" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.363406 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35335b0c-87c6-40c0-9362-0252727eebee-secret-volume\") pod \"collect-profiles-29557980-rlrhp\" (UID: \"35335b0c-87c6-40c0-9362-0252727eebee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.363527 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35335b0c-87c6-40c0-9362-0252727eebee-config-volume\") pod \"collect-profiles-29557980-rlrhp\" (UID: 
\"35335b0c-87c6-40c0-9362-0252727eebee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.363692 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6pzl\" (UniqueName: \"kubernetes.io/projected/35335b0c-87c6-40c0-9362-0252727eebee-kube-api-access-l6pzl\") pod \"collect-profiles-29557980-rlrhp\" (UID: \"35335b0c-87c6-40c0-9362-0252727eebee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.364781 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35335b0c-87c6-40c0-9362-0252727eebee-config-volume\") pod \"collect-profiles-29557980-rlrhp\" (UID: \"35335b0c-87c6-40c0-9362-0252727eebee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.448022 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35335b0c-87c6-40c0-9362-0252727eebee-secret-volume\") pod \"collect-profiles-29557980-rlrhp\" (UID: \"35335b0c-87c6-40c0-9362-0252727eebee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.450796 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6pzl\" (UniqueName: \"kubernetes.io/projected/35335b0c-87c6-40c0-9362-0252727eebee-kube-api-access-l6pzl\") pod \"collect-profiles-29557980-rlrhp\" (UID: \"35335b0c-87c6-40c0-9362-0252727eebee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.456766 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557980-v2mp9" Mar 14 09:00:00 crc kubenswrapper[4956]: I0314 09:00:00.549601 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp" Mar 14 09:00:05 crc kubenswrapper[4956]: I0314 09:00:05.345274 4956 patch_prober.go:28] interesting pod/route-controller-manager-599fc45645-tf88d container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 09:00:05 crc kubenswrapper[4956]: I0314 09:00:05.345404 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" podUID="9c626386-a00a-419c-9116-f2e19d48807a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 09:00:06 crc kubenswrapper[4956]: E0314 09:00:06.259198 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6e9606_19aa_43f7_8344_ebc9f5c3f31a.slice\": RecentStats: unable to find data in memory cache]" Mar 14 09:00:07 crc kubenswrapper[4956]: I0314 09:00:07.152817 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 
09:00:07 crc kubenswrapper[4956]: I0314 09:00:07.152915 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 09:00:07 crc kubenswrapper[4956]: I0314 09:00:07.155394 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 14 09:00:07 crc kubenswrapper[4956]: I0314 09:00:07.155609 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 14 09:00:07 crc kubenswrapper[4956]: I0314 09:00:07.166968 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 09:00:07 crc kubenswrapper[4956]: I0314 09:00:07.254456 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 09:00:07 crc kubenswrapper[4956]: I0314 09:00:07.254596 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 09:00:07 crc kubenswrapper[4956]: I0314 09:00:07.256435 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 14 09:00:07 crc kubenswrapper[4956]: I0314 09:00:07.265802 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 14 09:00:07 crc kubenswrapper[4956]: I0314 09:00:07.280366 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 09:00:07 crc kubenswrapper[4956]: I0314 09:00:07.281979 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 09:00:07 crc kubenswrapper[4956]: I0314 09:00:07.306388 4956 patch_prober.go:28] interesting pod/controller-manager-bd7c99444-gqf6q container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 09:00:07 crc kubenswrapper[4956]: I0314 09:00:07.306507 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" podUID="782a28e5-7216-4f1c-aa1e-71a6a7d5044f" 
containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 09:00:07 crc kubenswrapper[4956]: I0314 09:00:07.430972 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 09:00:07 crc kubenswrapper[4956]: I0314 09:00:07.439324 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 09:00:07 crc kubenswrapper[4956]: I0314 09:00:07.903771 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 09:00:08 crc kubenswrapper[4956]: I0314 09:00:08.048398 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 09:00:08 crc kubenswrapper[4956]: I0314 09:00:08.412558 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc6w2" Mar 14 09:00:10 crc kubenswrapper[4956]: E0314 09:00:10.067792 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 14 09:00:10 crc kubenswrapper[4956]: E0314 09:00:10.068158 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2vn45,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMoun
t:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-dd7hc_openshift-marketplace(038a2b56-42df-4121-b7b4-bdecf2ccb674): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 09:00:10 crc kubenswrapper[4956]: E0314 09:00:10.069434 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-dd7hc" podUID="038a2b56-42df-4121-b7b4-bdecf2ccb674" Mar 14 09:00:10 crc kubenswrapper[4956]: E0314 09:00:10.216559 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 14 09:00:10 crc kubenswrapper[4956]: E0314 09:00:10.216709 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rb4vj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8xw98_openshift-marketplace(6a3b7192-2792-4295-b25d-a22c476cd174): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 09:00:10 crc kubenswrapper[4956]: E0314 09:00:10.217933 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8xw98" podUID="6a3b7192-2792-4295-b25d-a22c476cd174" Mar 14 09:00:11 crc 
kubenswrapper[4956]: E0314 09:00:11.191316 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 14 09:00:11 crc kubenswrapper[4956]: E0314 09:00:11.192648 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r7rkl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-f4ztd_openshift-marketplace(78b22cf8-2118-463f-804e-9890feee4427): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 09:00:11 crc kubenswrapper[4956]: E0314 09:00:11.194218 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-f4ztd" podUID="78b22cf8-2118-463f-804e-9890feee4427" Mar 14 09:00:13 crc kubenswrapper[4956]: I0314 09:00:13.453534 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 14 09:00:13 crc kubenswrapper[4956]: I0314 09:00:13.454399 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 09:00:13 crc kubenswrapper[4956]: I0314 09:00:13.456673 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 14 09:00:13 crc kubenswrapper[4956]: I0314 09:00:13.457220 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 14 09:00:13 crc kubenswrapper[4956]: I0314 09:00:13.458036 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 14 09:00:13 crc kubenswrapper[4956]: I0314 09:00:13.543512 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/feea3874-1da5-4b39-b76d-06eea186b678-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"feea3874-1da5-4b39-b76d-06eea186b678\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 09:00:13 crc kubenswrapper[4956]: I0314 
09:00:13.543590 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/feea3874-1da5-4b39-b76d-06eea186b678-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"feea3874-1da5-4b39-b76d-06eea186b678\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 09:00:13 crc kubenswrapper[4956]: I0314 09:00:13.644375 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/feea3874-1da5-4b39-b76d-06eea186b678-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"feea3874-1da5-4b39-b76d-06eea186b678\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 09:00:13 crc kubenswrapper[4956]: I0314 09:00:13.644521 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/feea3874-1da5-4b39-b76d-06eea186b678-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"feea3874-1da5-4b39-b76d-06eea186b678\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 09:00:13 crc kubenswrapper[4956]: I0314 09:00:13.644688 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/feea3874-1da5-4b39-b76d-06eea186b678-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"feea3874-1da5-4b39-b76d-06eea186b678\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 09:00:13 crc kubenswrapper[4956]: I0314 09:00:13.677595 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/feea3874-1da5-4b39-b76d-06eea186b678-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"feea3874-1da5-4b39-b76d-06eea186b678\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 09:00:13 crc kubenswrapper[4956]: I0314 09:00:13.776808 4956 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 09:00:15 crc kubenswrapper[4956]: I0314 09:00:15.345057 4956 patch_prober.go:28] interesting pod/route-controller-manager-599fc45645-tf88d container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 09:00:15 crc kubenswrapper[4956]: I0314 09:00:15.346546 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" podUID="9c626386-a00a-419c-9116-f2e19d48807a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 09:00:16 crc kubenswrapper[4956]: E0314 09:00:16.410866 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6e9606_19aa_43f7_8344_ebc9f5c3f31a.slice\": RecentStats: unable to find data in memory cache]" Mar 14 09:00:17 crc kubenswrapper[4956]: I0314 09:00:17.306226 4956 patch_prober.go:28] interesting pod/controller-manager-bd7c99444-gqf6q container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 09:00:17 crc kubenswrapper[4956]: I0314 09:00:17.306290 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" podUID="782a28e5-7216-4f1c-aa1e-71a6a7d5044f" 
containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 09:00:17 crc kubenswrapper[4956]: I0314 09:00:17.647961 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 14 09:00:17 crc kubenswrapper[4956]: I0314 09:00:17.649125 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 09:00:17 crc kubenswrapper[4956]: I0314 09:00:17.665502 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 14 09:00:17 crc kubenswrapper[4956]: I0314 09:00:17.698406 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-var-lock\") pod \"installer-9-crc\" (UID: \"ab17437b-05d0-4ae8-8d53-2b7eb384acbe\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 09:00:17 crc kubenswrapper[4956]: I0314 09:00:17.698696 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-kube-api-access\") pod \"installer-9-crc\" (UID: \"ab17437b-05d0-4ae8-8d53-2b7eb384acbe\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 09:00:17 crc kubenswrapper[4956]: I0314 09:00:17.698799 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ab17437b-05d0-4ae8-8d53-2b7eb384acbe\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 09:00:17 crc kubenswrapper[4956]: I0314 09:00:17.799888 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-var-lock\") pod \"installer-9-crc\" (UID: \"ab17437b-05d0-4ae8-8d53-2b7eb384acbe\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 09:00:17 crc kubenswrapper[4956]: I0314 09:00:17.800451 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-kube-api-access\") pod \"installer-9-crc\" (UID: \"ab17437b-05d0-4ae8-8d53-2b7eb384acbe\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 09:00:17 crc kubenswrapper[4956]: I0314 09:00:17.800150 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-var-lock\") pod \"installer-9-crc\" (UID: \"ab17437b-05d0-4ae8-8d53-2b7eb384acbe\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 09:00:17 crc kubenswrapper[4956]: I0314 09:00:17.800524 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ab17437b-05d0-4ae8-8d53-2b7eb384acbe\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 09:00:17 crc kubenswrapper[4956]: I0314 09:00:17.800582 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ab17437b-05d0-4ae8-8d53-2b7eb384acbe\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 09:00:17 crc kubenswrapper[4956]: I0314 09:00:17.824217 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-kube-api-access\") 
pod \"installer-9-crc\" (UID: \"ab17437b-05d0-4ae8-8d53-2b7eb384acbe\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 09:00:17 crc kubenswrapper[4956]: I0314 09:00:17.976265 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 09:00:18 crc kubenswrapper[4956]: E0314 09:00:18.262723 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-dd7hc" podUID="038a2b56-42df-4121-b7b4-bdecf2ccb674" Mar 14 09:00:18 crc kubenswrapper[4956]: E0314 09:00:18.262891 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8xw98" podUID="6a3b7192-2792-4295-b25d-a22c476cd174" Mar 14 09:00:18 crc kubenswrapper[4956]: E0314 09:00:18.262909 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-f4ztd" podUID="78b22cf8-2118-463f-804e-9890feee4427" Mar 14 09:00:18 crc kubenswrapper[4956]: E0314 09:00:18.573019 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 14 09:00:18 crc kubenswrapper[4956]: E0314 09:00:18.573367 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2brbh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-5glts_openshift-marketplace(a671bb4b-c176-4930-8b09-c5f1b03e27c9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 09:00:18 crc kubenswrapper[4956]: E0314 09:00:18.574675 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5glts" podUID="a671bb4b-c176-4930-8b09-c5f1b03e27c9" Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.210555 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5glts" podUID="a671bb4b-c176-4930-8b09-c5f1b03e27c9" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.385868 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.386078 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.422225 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk"] Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.422444 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c626386-a00a-419c-9116-f2e19d48807a" containerName="route-controller-manager" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.422455 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c626386-a00a-419c-9116-f2e19d48807a" containerName="route-controller-manager" Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.422466 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782a28e5-7216-4f1c-aa1e-71a6a7d5044f" containerName="controller-manager" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.422472 4956 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="782a28e5-7216-4f1c-aa1e-71a6a7d5044f" containerName="controller-manager" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.424953 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c626386-a00a-419c-9116-f2e19d48807a" containerName="route-controller-manager" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.424992 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="782a28e5-7216-4f1c-aa1e-71a6a7d5044f" containerName="controller-manager" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.425441 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.425580 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.425721 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vpwz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-j2vtm_openshift-marketplace(7f2f9b72-16af-43fb-9687-fe7cbbc51bb3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.431853 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-j2vtm" podUID="7f2f9b72-16af-43fb-9687-fe7cbbc51bb3" Mar 14 09:00:20 crc 
kubenswrapper[4956]: I0314 09:00:20.440286 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c626386-a00a-419c-9116-f2e19d48807a-serving-cert\") pod \"9c626386-a00a-419c-9116-f2e19d48807a\" (UID: \"9c626386-a00a-419c-9116-f2e19d48807a\") " Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.440364 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c626386-a00a-419c-9116-f2e19d48807a-client-ca\") pod \"9c626386-a00a-419c-9116-f2e19d48807a\" (UID: \"9c626386-a00a-419c-9116-f2e19d48807a\") " Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.440414 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-config\") pod \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.440454 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-client-ca\") pod \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.440511 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-serving-cert\") pod \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.440622 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbhzb\" (UniqueName: \"kubernetes.io/projected/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-kube-api-access-hbhzb\") pod 
\"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.440719 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-proxy-ca-bundles\") pod \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\" (UID: \"782a28e5-7216-4f1c-aa1e-71a6a7d5044f\") " Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.440755 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c626386-a00a-419c-9116-f2e19d48807a-config\") pod \"9c626386-a00a-419c-9116-f2e19d48807a\" (UID: \"9c626386-a00a-419c-9116-f2e19d48807a\") " Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.440792 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whn47\" (UniqueName: \"kubernetes.io/projected/9c626386-a00a-419c-9116-f2e19d48807a-kube-api-access-whn47\") pod \"9c626386-a00a-419c-9116-f2e19d48807a\" (UID: \"9c626386-a00a-419c-9116-f2e19d48807a\") " Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.441026 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-client-ca\") pod \"route-controller-manager-69c847c5c5-q6fjk\" (UID: \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\") " pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.441060 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-serving-cert\") pod \"route-controller-manager-69c847c5c5-q6fjk\" (UID: \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\") " 
pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.441086 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jrht\" (UniqueName: \"kubernetes.io/projected/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-kube-api-access-5jrht\") pod \"route-controller-manager-69c847c5c5-q6fjk\" (UID: \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\") " pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.441115 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-config\") pod \"route-controller-manager-69c847c5c5-q6fjk\" (UID: \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\") " pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.443637 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c626386-a00a-419c-9116-f2e19d48807a-client-ca" (OuterVolumeSpecName: "client-ca") pod "9c626386-a00a-419c-9116-f2e19d48807a" (UID: "9c626386-a00a-419c-9116-f2e19d48807a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.444912 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-config" (OuterVolumeSpecName: "config") pod "782a28e5-7216-4f1c-aa1e-71a6a7d5044f" (UID: "782a28e5-7216-4f1c-aa1e-71a6a7d5044f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.445375 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-client-ca" (OuterVolumeSpecName: "client-ca") pod "782a28e5-7216-4f1c-aa1e-71a6a7d5044f" (UID: "782a28e5-7216-4f1c-aa1e-71a6a7d5044f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.446746 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c626386-a00a-419c-9116-f2e19d48807a-config" (OuterVolumeSpecName: "config") pod "9c626386-a00a-419c-9116-f2e19d48807a" (UID: "9c626386-a00a-419c-9116-f2e19d48807a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.448939 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "782a28e5-7216-4f1c-aa1e-71a6a7d5044f" (UID: "782a28e5-7216-4f1c-aa1e-71a6a7d5044f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.450519 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk"] Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.467277 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "782a28e5-7216-4f1c-aa1e-71a6a7d5044f" (UID: "782a28e5-7216-4f1c-aa1e-71a6a7d5044f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.467446 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c626386-a00a-419c-9116-f2e19d48807a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9c626386-a00a-419c-9116-f2e19d48807a" (UID: "9c626386-a00a-419c-9116-f2e19d48807a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.470733 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c626386-a00a-419c-9116-f2e19d48807a-kube-api-access-whn47" (OuterVolumeSpecName: "kube-api-access-whn47") pod "9c626386-a00a-419c-9116-f2e19d48807a" (UID: "9c626386-a00a-419c-9116-f2e19d48807a"). InnerVolumeSpecName "kube-api-access-whn47". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.473524 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-kube-api-access-hbhzb" (OuterVolumeSpecName: "kube-api-access-hbhzb") pod "782a28e5-7216-4f1c-aa1e-71a6a7d5044f" (UID: "782a28e5-7216-4f1c-aa1e-71a6a7d5044f"). InnerVolumeSpecName "kube-api-access-hbhzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.499157 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-42pn5"] Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.511594 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.512170 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2pvc4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false
,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9n4jp_openshift-marketplace(011b2d6b-88b0-4013-9ded-b9845c02dec0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.513319 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9n4jp" podUID="011b2d6b-88b0-4013-9ded-b9845c02dec0" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.542738 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-client-ca\") pod \"route-controller-manager-69c847c5c5-q6fjk\" (UID: \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\") " pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.542805 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-serving-cert\") pod \"route-controller-manager-69c847c5c5-q6fjk\" (UID: \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\") " pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.542839 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jrht\" (UniqueName: \"kubernetes.io/projected/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-kube-api-access-5jrht\") pod 
\"route-controller-manager-69c847c5c5-q6fjk\" (UID: \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\") " pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.542873 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-config\") pod \"route-controller-manager-69c847c5c5-q6fjk\" (UID: \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\") " pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.542930 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.542947 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbhzb\" (UniqueName: \"kubernetes.io/projected/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-kube-api-access-hbhzb\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.542958 4956 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.542969 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c626386-a00a-419c-9116-f2e19d48807a-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.542979 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whn47\" (UniqueName: \"kubernetes.io/projected/9c626386-a00a-419c-9116-f2e19d48807a-kube-api-access-whn47\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 
09:00:20.542989 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c626386-a00a-419c-9116-f2e19d48807a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.542998 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c626386-a00a-419c-9116-f2e19d48807a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.543008 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.543017 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/782a28e5-7216-4f1c-aa1e-71a6a7d5044f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.547669 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-client-ca\") pod \"route-controller-manager-69c847c5c5-q6fjk\" (UID: \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\") " pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.549901 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-config\") pod \"route-controller-manager-69c847c5c5-q6fjk\" (UID: \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\") " pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.556582 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-serving-cert\") pod \"route-controller-manager-69c847c5c5-q6fjk\" (UID: \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\") " pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.567855 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jrht\" (UniqueName: \"kubernetes.io/projected/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-kube-api-access-5jrht\") pod \"route-controller-manager-69c847c5c5-q6fjk\" (UID: \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\") " pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.585807 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.585982 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m2lpp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-kh627_openshift-marketplace(39583bbe-6bbd-4423-b048-61c0dc5d955e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.587173 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-kh627" podUID="39583bbe-6bbd-4423-b048-61c0dc5d955e" Mar 14 09:00:20 crc 
kubenswrapper[4956]: I0314 09:00:20.688336 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.688354 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d" event={"ID":"9c626386-a00a-419c-9116-f2e19d48807a","Type":"ContainerDied","Data":"76124f4aaa2be70970e12f23315a068e7431877debd5fa55dee0272551a3bf73"} Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.688538 4956 scope.go:117] "RemoveContainer" containerID="76132cc357a5d9fce86b5060532bead51a1bba74fa70811aca7ad83b217436d5" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.689851 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-42pn5" event={"ID":"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10","Type":"ContainerStarted","Data":"6f3ca86a7641d4d3cbf402039ebd1565fbbf3ae7fe49a1140b95e1199ca0f277"} Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.692315 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" event={"ID":"782a28e5-7216-4f1c-aa1e-71a6a7d5044f","Type":"ContainerDied","Data":"5218ebc666709092f29e59ee31af739221572632dbd3b33960bd18413ed87fc2"} Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.692378 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bd7c99444-gqf6q" Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.694736 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-j2vtm" podUID="7f2f9b72-16af-43fb-9687-fe7cbbc51bb3" Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.695632 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9n4jp" podUID="011b2d6b-88b0-4013-9ded-b9845c02dec0" Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.695693 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-kh627" podUID="39583bbe-6bbd-4423-b048-61c0dc5d955e" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.710548 4956 scope.go:117] "RemoveContainer" containerID="022bb866482c4093b8b4d6bf06868fd669f4930c06d372c66b187689e8273215" Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.735827 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.735976 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2xsff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-n2x6s_openshift-marketplace(2667e495-4a15-4aa2-8839-e1b66f2ee380): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 09:00:20 crc kubenswrapper[4956]: E0314 09:00:20.737502 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-n2x6s" podUID="2667e495-4a15-4aa2-8839-e1b66f2ee380" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.782288 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d"] Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.788908 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599fc45645-tf88d"] Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.792065 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bd7c99444-gqf6q"] Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.794656 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-bd7c99444-gqf6q"] Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.801465 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.836246 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 14 09:00:20 crc kubenswrapper[4956]: W0314 09:00:20.842855 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8c4ffd1_3ad7_4eef_bbec_b48d5a86ee5c.slice/crio-9ec6a5f82d7f146cb84bd76af02e54237138159d48654c457b5f64cd05f52b00 WatchSource:0}: Error finding container 9ec6a5f82d7f146cb84bd76af02e54237138159d48654c457b5f64cd05f52b00: Status 404 returned error can't find the container with id 9ec6a5f82d7f146cb84bd76af02e54237138159d48654c457b5f64cd05f52b00 Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.844237 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557980-v2mp9"] Mar 14 09:00:20 crc kubenswrapper[4956]: W0314 09:00:20.845834 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podab17437b_05d0_4ae8_8d53_2b7eb384acbe.slice/crio-8ecece98e3473dd204ad1e816247445ea9b04caefa3ed183c8b4a2eb547b5bfd WatchSource:0}: Error finding container 8ecece98e3473dd204ad1e816247445ea9b04caefa3ed183c8b4a2eb547b5bfd: Status 404 returned error can't find the container with id 8ecece98e3473dd204ad1e816247445ea9b04caefa3ed183c8b4a2eb547b5bfd Mar 14 09:00:20 crc kubenswrapper[4956]: W0314 09:00:20.848972 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-9bfc6f1be61d2fd74228b04bb11c8e6adb0fd2dc254299edb7f5ffc0e1da58a8 WatchSource:0}: Error finding container 9bfc6f1be61d2fd74228b04bb11c8e6adb0fd2dc254299edb7f5ffc0e1da58a8: Status 404 returned error can't find the container with id 
9bfc6f1be61d2fd74228b04bb11c8e6adb0fd2dc254299edb7f5ffc0e1da58a8 Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.931863 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 14 09:00:20 crc kubenswrapper[4956]: I0314 09:00:20.945714 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp"] Mar 14 09:00:20 crc kubenswrapper[4956]: W0314 09:00:20.967599 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-dfb10028aa1ade384ca6bd99232bb9b092e9a7129b4df7d0b38a897961718cf5 WatchSource:0}: Error finding container dfb10028aa1ade384ca6bd99232bb9b092e9a7129b4df7d0b38a897961718cf5: Status 404 returned error can't find the container with id dfb10028aa1ade384ca6bd99232bb9b092e9a7129b4df7d0b38a897961718cf5 Mar 14 09:00:20 crc kubenswrapper[4956]: W0314 09:00:20.980453 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35335b0c_87c6_40c0_9362_0252727eebee.slice/crio-befed6f2ebfad2fef2bff66b55c285e96d9abaacc066e081b4102b934c19e14f WatchSource:0}: Error finding container befed6f2ebfad2fef2bff66b55c285e96d9abaacc066e081b4102b934c19e14f: Status 404 returned error can't find the container with id befed6f2ebfad2fef2bff66b55c285e96d9abaacc066e081b4102b934c19e14f Mar 14 09:00:20 crc kubenswrapper[4956]: W0314 09:00:20.983190 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-d8b4db36153dfcff20e86dc1180cb2c05b2e9e1fd286a2b1722002a4d4cb8687 WatchSource:0}: Error finding container d8b4db36153dfcff20e86dc1180cb2c05b2e9e1fd286a2b1722002a4d4cb8687: Status 404 returned error can't find the container with id 
d8b4db36153dfcff20e86dc1180cb2c05b2e9e1fd286a2b1722002a4d4cb8687 Mar 14 09:00:20 crc kubenswrapper[4956]: W0314 09:00:20.985027 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfeea3874_1da5_4b39_b76d_06eea186b678.slice/crio-008fd2d9c788e0a82c7fb9e59f0a1be28048899e187ef9ac685b2bbee2ae128e WatchSource:0}: Error finding container 008fd2d9c788e0a82c7fb9e59f0a1be28048899e187ef9ac685b2bbee2ae128e: Status 404 returned error can't find the container with id 008fd2d9c788e0a82c7fb9e59f0a1be28048899e187ef9ac685b2bbee2ae128e Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.044313 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk"] Mar 14 09:00:21 crc kubenswrapper[4956]: W0314 09:00:21.071304 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3c775cf_b7e3_4171_83f0_49b6f77a5e51.slice/crio-27ceb125a211061d3a0332bffbfd5a97621182d24154c4a0365151dc83a6ed5b WatchSource:0}: Error finding container 27ceb125a211061d3a0332bffbfd5a97621182d24154c4a0365151dc83a6ed5b: Status 404 returned error can't find the container with id 27ceb125a211061d3a0332bffbfd5a97621182d24154c4a0365151dc83a6ed5b Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.224177 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="782a28e5-7216-4f1c-aa1e-71a6a7d5044f" path="/var/lib/kubelet/pods/782a28e5-7216-4f1c-aa1e-71a6a7d5044f/volumes" Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.225258 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c626386-a00a-419c-9116-f2e19d48807a" path="/var/lib/kubelet/pods/9c626386-a00a-419c-9116-f2e19d48807a/volumes" Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.715926 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"ab17437b-05d0-4ae8-8d53-2b7eb384acbe","Type":"ContainerStarted","Data":"a6d1ca41b0391bfadf528786ecee02cf9ef21d35ffa087d062488db5e07cf9df"} Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.716453 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ab17437b-05d0-4ae8-8d53-2b7eb384acbe","Type":"ContainerStarted","Data":"8ecece98e3473dd204ad1e816247445ea9b04caefa3ed183c8b4a2eb547b5bfd"} Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.721655 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6dd4fa0a10e92be78b8b53ad30675611d92bcd3e17210c435ef6ba10b35b06db"} Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.721696 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9bfc6f1be61d2fd74228b04bb11c8e6adb0fd2dc254299edb7f5ffc0e1da58a8"} Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.724037 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-42pn5" event={"ID":"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10","Type":"ContainerStarted","Data":"bfcd4bf4f9c72d739175166ddf32873eb1898acc7ff41e526eaae8eb85048d84"} Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.724063 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-42pn5" event={"ID":"bf6ad235-d99c-46a7-8c2d-6fc12fc07c10","Type":"ContainerStarted","Data":"0ec6cbb7785f410472a4f261e75bd133c8fba9019799276627654aae7c0c50d3"} Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.726616 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"feea3874-1da5-4b39-b76d-06eea186b678","Type":"ContainerStarted","Data":"ac089e50cf3fd8cbff81062dcb09d4ce528e5c689f9855c7ce2fb04302b7246c"} Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.726642 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"feea3874-1da5-4b39-b76d-06eea186b678","Type":"ContainerStarted","Data":"008fd2d9c788e0a82c7fb9e59f0a1be28048899e187ef9ac685b2bbee2ae128e"} Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.728883 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"06f26ed135cd4b31eb1aa75e9ec4b15eaadf8c0ef8fb7a92e312e736e8e38854"} Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.728908 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d8b4db36153dfcff20e86dc1180cb2c05b2e9e1fd286a2b1722002a4d4cb8687"} Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.729227 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.730908 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" event={"ID":"e3c775cf-b7e3-4171-83f0-49b6f77a5e51","Type":"ContainerStarted","Data":"8019b6175bcc0d29415d1bf6a5ab797125e235525c4c6b07c4b63e52209957ee"} Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.730933 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" 
event={"ID":"e3c775cf-b7e3-4171-83f0-49b6f77a5e51","Type":"ContainerStarted","Data":"27ceb125a211061d3a0332bffbfd5a97621182d24154c4a0365151dc83a6ed5b"} Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.731659 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.736782 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557980-v2mp9" event={"ID":"f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c","Type":"ContainerStarted","Data":"9ec6a5f82d7f146cb84bd76af02e54237138159d48654c457b5f64cd05f52b00"} Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.739573 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f638dab44df03be40306ad6b4122c9d6fa18f6eb78caf2ea307bb17bfbe1cadd"} Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.739610 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"dfb10028aa1ade384ca6bd99232bb9b092e9a7129b4df7d0b38a897961718cf5"} Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.741985 4956 generic.go:334] "Generic (PLEG): container finished" podID="35335b0c-87c6-40c0-9362-0252727eebee" containerID="1e8530750c0ec3c127d0b7351bf4a3de4bd6e1db982aeb18acd933b9d4a2b560" exitCode=0 Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.742524 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp" event={"ID":"35335b0c-87c6-40c0-9362-0252727eebee","Type":"ContainerDied","Data":"1e8530750c0ec3c127d0b7351bf4a3de4bd6e1db982aeb18acd933b9d4a2b560"} Mar 14 09:00:21 crc 
kubenswrapper[4956]: I0314 09:00:21.742546 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp" event={"ID":"35335b0c-87c6-40c0-9362-0252727eebee","Type":"ContainerStarted","Data":"befed6f2ebfad2fef2bff66b55c285e96d9abaacc066e081b4102b934c19e14f"} Mar 14 09:00:21 crc kubenswrapper[4956]: E0314 09:00:21.748541 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-n2x6s" podUID="2667e495-4a15-4aa2-8839-e1b66f2ee380" Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.748962 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.779187 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=4.779170979 podStartE2EDuration="4.779170979s" podCreationTimestamp="2026-03-14 09:00:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:00:21.75312344 +0000 UTC m=+227.265815708" watchObservedRunningTime="2026-03-14 09:00:21.779170979 +0000 UTC m=+227.291863247" Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.779564 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=8.779560569000001 podStartE2EDuration="8.779560569s" podCreationTimestamp="2026-03-14 09:00:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:00:21.777029976 +0000 UTC m=+227.289722244" 
watchObservedRunningTime="2026-03-14 09:00:21.779560569 +0000 UTC m=+227.292252847" Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.805179 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-42pn5" podStartSLOduration=153.805147136 podStartE2EDuration="2m33.805147136s" podCreationTimestamp="2026-03-14 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:00:21.805132826 +0000 UTC m=+227.317825094" watchObservedRunningTime="2026-03-14 09:00:21.805147136 +0000 UTC m=+227.317839404" Mar 14 09:00:21 crc kubenswrapper[4956]: I0314 09:00:21.925992 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" podStartSLOduration=10.925977405 podStartE2EDuration="10.925977405s" podCreationTimestamp="2026-03-14 09:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:00:21.925037372 +0000 UTC m=+227.437729640" watchObservedRunningTime="2026-03-14 09:00:21.925977405 +0000 UTC m=+227.438669673" Mar 14 09:00:22 crc kubenswrapper[4956]: I0314 09:00:22.752836 4956 generic.go:334] "Generic (PLEG): container finished" podID="feea3874-1da5-4b39-b76d-06eea186b678" containerID="ac089e50cf3fd8cbff81062dcb09d4ce528e5c689f9855c7ce2fb04302b7246c" exitCode=0 Mar 14 09:00:22 crc kubenswrapper[4956]: I0314 09:00:22.752886 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"feea3874-1da5-4b39-b76d-06eea186b678","Type":"ContainerDied","Data":"ac089e50cf3fd8cbff81062dcb09d4ce528e5c689f9855c7ce2fb04302b7246c"} Mar 14 09:00:22 crc kubenswrapper[4956]: I0314 09:00:22.990294 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.012678 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c77f5689b-db2pz"] Mar 14 09:00:23 crc kubenswrapper[4956]: E0314 09:00:23.012958 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35335b0c-87c6-40c0-9362-0252727eebee" containerName="collect-profiles" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.012970 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="35335b0c-87c6-40c0-9362-0252727eebee" containerName="collect-profiles" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.013068 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="35335b0c-87c6-40c0-9362-0252727eebee" containerName="collect-profiles" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.013508 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.015646 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.015945 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.017173 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.017357 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.017808 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.017814 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.025578 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.027187 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c77f5689b-db2pz"] Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.095187 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35335b0c-87c6-40c0-9362-0252727eebee-config-volume\") pod \"35335b0c-87c6-40c0-9362-0252727eebee\" (UID: \"35335b0c-87c6-40c0-9362-0252727eebee\") " Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.095282 4956 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6pzl\" (UniqueName: \"kubernetes.io/projected/35335b0c-87c6-40c0-9362-0252727eebee-kube-api-access-l6pzl\") pod \"35335b0c-87c6-40c0-9362-0252727eebee\" (UID: \"35335b0c-87c6-40c0-9362-0252727eebee\") " Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.095311 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35335b0c-87c6-40c0-9362-0252727eebee-secret-volume\") pod \"35335b0c-87c6-40c0-9362-0252727eebee\" (UID: \"35335b0c-87c6-40c0-9362-0252727eebee\") " Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.095616 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qf62\" (UniqueName: \"kubernetes.io/projected/716b429f-386c-4ef4-9951-500bb511dc6b-kube-api-access-2qf62\") pod \"controller-manager-c77f5689b-db2pz\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.095664 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/716b429f-386c-4ef4-9951-500bb511dc6b-serving-cert\") pod \"controller-manager-c77f5689b-db2pz\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.095691 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-proxy-ca-bundles\") pod \"controller-manager-c77f5689b-db2pz\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:23 crc 
kubenswrapper[4956]: I0314 09:00:23.095934 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-config\") pod \"controller-manager-c77f5689b-db2pz\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.095974 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-client-ca\") pod \"controller-manager-c77f5689b-db2pz\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.096344 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35335b0c-87c6-40c0-9362-0252727eebee-config-volume" (OuterVolumeSpecName: "config-volume") pod "35335b0c-87c6-40c0-9362-0252727eebee" (UID: "35335b0c-87c6-40c0-9362-0252727eebee"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.101554 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35335b0c-87c6-40c0-9362-0252727eebee-kube-api-access-l6pzl" (OuterVolumeSpecName: "kube-api-access-l6pzl") pod "35335b0c-87c6-40c0-9362-0252727eebee" (UID: "35335b0c-87c6-40c0-9362-0252727eebee"). InnerVolumeSpecName "kube-api-access-l6pzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.102593 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35335b0c-87c6-40c0-9362-0252727eebee-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "35335b0c-87c6-40c0-9362-0252727eebee" (UID: "35335b0c-87c6-40c0-9362-0252727eebee"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.197728 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/716b429f-386c-4ef4-9951-500bb511dc6b-serving-cert\") pod \"controller-manager-c77f5689b-db2pz\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.198396 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-proxy-ca-bundles\") pod \"controller-manager-c77f5689b-db2pz\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.198458 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-config\") pod \"controller-manager-c77f5689b-db2pz\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.198519 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-client-ca\") pod 
\"controller-manager-c77f5689b-db2pz\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.198575 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qf62\" (UniqueName: \"kubernetes.io/projected/716b429f-386c-4ef4-9951-500bb511dc6b-kube-api-access-2qf62\") pod \"controller-manager-c77f5689b-db2pz\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.198638 4956 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35335b0c-87c6-40c0-9362-0252727eebee-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.198653 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6pzl\" (UniqueName: \"kubernetes.io/projected/35335b0c-87c6-40c0-9362-0252727eebee-kube-api-access-l6pzl\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.198669 4956 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35335b0c-87c6-40c0-9362-0252727eebee-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.199642 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-proxy-ca-bundles\") pod \"controller-manager-c77f5689b-db2pz\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.199709 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-client-ca\") pod \"controller-manager-c77f5689b-db2pz\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.201491 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-config\") pod \"controller-manager-c77f5689b-db2pz\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.203220 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/716b429f-386c-4ef4-9951-500bb511dc6b-serving-cert\") pod \"controller-manager-c77f5689b-db2pz\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.214684 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qf62\" (UniqueName: \"kubernetes.io/projected/716b429f-386c-4ef4-9951-500bb511dc6b-kube-api-access-2qf62\") pod \"controller-manager-c77f5689b-db2pz\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.343101 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.761736 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp" event={"ID":"35335b0c-87c6-40c0-9362-0252727eebee","Type":"ContainerDied","Data":"befed6f2ebfad2fef2bff66b55c285e96d9abaacc066e081b4102b934c19e14f"} Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.761769 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp" Mar 14 09:00:23 crc kubenswrapper[4956]: I0314 09:00:23.761799 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="befed6f2ebfad2fef2bff66b55c285e96d9abaacc066e081b4102b934c19e14f" Mar 14 09:00:25 crc kubenswrapper[4956]: I0314 09:00:25.424125 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:00:25 crc kubenswrapper[4956]: I0314 09:00:25.424619 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:00:25 crc kubenswrapper[4956]: I0314 09:00:25.674643 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 09:00:25 crc kubenswrapper[4956]: I0314 09:00:25.743342 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/feea3874-1da5-4b39-b76d-06eea186b678-kubelet-dir\") pod \"feea3874-1da5-4b39-b76d-06eea186b678\" (UID: \"feea3874-1da5-4b39-b76d-06eea186b678\") " Mar 14 09:00:25 crc kubenswrapper[4956]: I0314 09:00:25.743714 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/feea3874-1da5-4b39-b76d-06eea186b678-kube-api-access\") pod \"feea3874-1da5-4b39-b76d-06eea186b678\" (UID: \"feea3874-1da5-4b39-b76d-06eea186b678\") " Mar 14 09:00:25 crc kubenswrapper[4956]: I0314 09:00:25.743540 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/feea3874-1da5-4b39-b76d-06eea186b678-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "feea3874-1da5-4b39-b76d-06eea186b678" (UID: "feea3874-1da5-4b39-b76d-06eea186b678"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:00:25 crc kubenswrapper[4956]: I0314 09:00:25.743982 4956 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/feea3874-1da5-4b39-b76d-06eea186b678-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:25 crc kubenswrapper[4956]: I0314 09:00:25.750986 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feea3874-1da5-4b39-b76d-06eea186b678-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "feea3874-1da5-4b39-b76d-06eea186b678" (UID: "feea3874-1da5-4b39-b76d-06eea186b678"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:25 crc kubenswrapper[4956]: I0314 09:00:25.776379 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"feea3874-1da5-4b39-b76d-06eea186b678","Type":"ContainerDied","Data":"008fd2d9c788e0a82c7fb9e59f0a1be28048899e187ef9ac685b2bbee2ae128e"} Mar 14 09:00:25 crc kubenswrapper[4956]: I0314 09:00:25.776468 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="008fd2d9c788e0a82c7fb9e59f0a1be28048899e187ef9ac685b2bbee2ae128e" Mar 14 09:00:25 crc kubenswrapper[4956]: I0314 09:00:25.776626 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 09:00:25 crc kubenswrapper[4956]: I0314 09:00:25.845229 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/feea3874-1da5-4b39-b76d-06eea186b678-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:26 crc kubenswrapper[4956]: E0314 09:00:26.518172 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6e9606_19aa_43f7_8344_ebc9f5c3f31a.slice\": RecentStats: unable to find data in memory cache]" Mar 14 09:00:27 crc kubenswrapper[4956]: I0314 09:00:27.090278 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c77f5689b-db2pz"] Mar 14 09:00:27 crc kubenswrapper[4956]: W0314 09:00:27.101208 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod716b429f_386c_4ef4_9951_500bb511dc6b.slice/crio-0dd65e65152c7349ef64d150c561d4401821afef403bb4a870679306048670d8 WatchSource:0}: Error finding container 
0dd65e65152c7349ef64d150c561d4401821afef403bb4a870679306048670d8: Status 404 returned error can't find the container with id 0dd65e65152c7349ef64d150c561d4401821afef403bb4a870679306048670d8 Mar 14 09:00:27 crc kubenswrapper[4956]: I0314 09:00:27.693272 4956 csr.go:261] certificate signing request csr-trxc9 is approved, waiting to be issued Mar 14 09:00:27 crc kubenswrapper[4956]: I0314 09:00:27.699842 4956 csr.go:257] certificate signing request csr-trxc9 is issued Mar 14 09:00:27 crc kubenswrapper[4956]: I0314 09:00:27.788425 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" event={"ID":"716b429f-386c-4ef4-9951-500bb511dc6b","Type":"ContainerStarted","Data":"898479962cdffb7972905ae9374c42a2f0ca7338105664294eae84355c4f5774"} Mar 14 09:00:27 crc kubenswrapper[4956]: I0314 09:00:27.788472 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" event={"ID":"716b429f-386c-4ef4-9951-500bb511dc6b","Type":"ContainerStarted","Data":"0dd65e65152c7349ef64d150c561d4401821afef403bb4a870679306048670d8"} Mar 14 09:00:27 crc kubenswrapper[4956]: I0314 09:00:27.788680 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:27 crc kubenswrapper[4956]: I0314 09:00:27.791090 4956 generic.go:334] "Generic (PLEG): container finished" podID="f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c" containerID="b5517deab51d671273b4cea4f69e3a120cc15c2c5c4c6c8297d475a4727b8c01" exitCode=0 Mar 14 09:00:27 crc kubenswrapper[4956]: I0314 09:00:27.791154 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557980-v2mp9" event={"ID":"f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c","Type":"ContainerDied","Data":"b5517deab51d671273b4cea4f69e3a120cc15c2c5c4c6c8297d475a4727b8c01"} Mar 14 09:00:27 crc kubenswrapper[4956]: I0314 09:00:27.794745 4956 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:27 crc kubenswrapper[4956]: I0314 09:00:27.821203 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" podStartSLOduration=16.821180468 podStartE2EDuration="16.821180468s" podCreationTimestamp="2026-03-14 09:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:00:27.802781769 +0000 UTC m=+233.315474037" watchObservedRunningTime="2026-03-14 09:00:27.821180468 +0000 UTC m=+233.333872736" Mar 14 09:00:28 crc kubenswrapper[4956]: I0314 09:00:28.700803 4956 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-10 07:19:58.251893586 +0000 UTC Mar 14 09:00:28 crc kubenswrapper[4956]: I0314 09:00:28.701053 4956 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6502h19m29.550843976s for next certificate rotation Mar 14 09:00:29 crc kubenswrapper[4956]: I0314 09:00:29.111373 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557980-v2mp9" Mar 14 09:00:29 crc kubenswrapper[4956]: I0314 09:00:29.190520 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rhpq\" (UniqueName: \"kubernetes.io/projected/f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c-kube-api-access-6rhpq\") pod \"f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c\" (UID: \"f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c\") " Mar 14 09:00:29 crc kubenswrapper[4956]: I0314 09:00:29.196641 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c-kube-api-access-6rhpq" (OuterVolumeSpecName: "kube-api-access-6rhpq") pod "f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c" (UID: "f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c"). InnerVolumeSpecName "kube-api-access-6rhpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:29 crc kubenswrapper[4956]: I0314 09:00:29.292025 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rhpq\" (UniqueName: \"kubernetes.io/projected/f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c-kube-api-access-6rhpq\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:29 crc kubenswrapper[4956]: I0314 09:00:29.701560 4956 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-28 13:17:34.600525179 +0000 UTC Mar 14 09:00:29 crc kubenswrapper[4956]: I0314 09:00:29.701594 4956 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6220h17m4.89893386s for next certificate rotation Mar 14 09:00:29 crc kubenswrapper[4956]: I0314 09:00:29.804989 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557980-v2mp9" event={"ID":"f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c","Type":"ContainerDied","Data":"9ec6a5f82d7f146cb84bd76af02e54237138159d48654c457b5f64cd05f52b00"} Mar 14 09:00:29 crc kubenswrapper[4956]: I0314 
09:00:29.805039 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557980-v2mp9" Mar 14 09:00:29 crc kubenswrapper[4956]: I0314 09:00:29.805075 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ec6a5f82d7f146cb84bd76af02e54237138159d48654c457b5f64cd05f52b00" Mar 14 09:00:32 crc kubenswrapper[4956]: I0314 09:00:32.823739 4956 generic.go:334] "Generic (PLEG): container finished" podID="038a2b56-42df-4121-b7b4-bdecf2ccb674" containerID="9833e56c303d50e7b34800407218700e070ce0251a60b4f2d6b06144b8c38fa1" exitCode=0 Mar 14 09:00:32 crc kubenswrapper[4956]: I0314 09:00:32.823793 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7hc" event={"ID":"038a2b56-42df-4121-b7b4-bdecf2ccb674","Type":"ContainerDied","Data":"9833e56c303d50e7b34800407218700e070ce0251a60b4f2d6b06144b8c38fa1"} Mar 14 09:00:33 crc kubenswrapper[4956]: I0314 09:00:33.834351 4956 generic.go:334] "Generic (PLEG): container finished" podID="78b22cf8-2118-463f-804e-9890feee4427" containerID="521651c5c3411b55a2394f29664b79a71bbb893c6cc9d66cb2240c0cb72df7cc" exitCode=0 Mar 14 09:00:33 crc kubenswrapper[4956]: I0314 09:00:33.834424 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4ztd" event={"ID":"78b22cf8-2118-463f-804e-9890feee4427","Type":"ContainerDied","Data":"521651c5c3411b55a2394f29664b79a71bbb893c6cc9d66cb2240c0cb72df7cc"} Mar 14 09:00:33 crc kubenswrapper[4956]: I0314 09:00:33.838237 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7hc" event={"ID":"038a2b56-42df-4121-b7b4-bdecf2ccb674","Type":"ContainerStarted","Data":"1fc9aa2d379aa19e85de6c66a2661c0971237cb32bbc94e32687d309e09798da"} Mar 14 09:00:33 crc kubenswrapper[4956]: I0314 09:00:33.840851 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="7f2f9b72-16af-43fb-9687-fe7cbbc51bb3" containerID="2cb746ea276a53ffe403d6c8584658e2c4ea68a0ec683328239e6a79cfd6863b" exitCode=0 Mar 14 09:00:33 crc kubenswrapper[4956]: I0314 09:00:33.840894 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2vtm" event={"ID":"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3","Type":"ContainerDied","Data":"2cb746ea276a53ffe403d6c8584658e2c4ea68a0ec683328239e6a79cfd6863b"} Mar 14 09:00:33 crc kubenswrapper[4956]: I0314 09:00:33.891889 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dd7hc" podStartSLOduration=2.794128518 podStartE2EDuration="59.891866019s" podCreationTimestamp="2026-03-14 08:59:34 +0000 UTC" firstStartedPulling="2026-03-14 08:59:36.244820058 +0000 UTC m=+181.757512326" lastFinishedPulling="2026-03-14 09:00:33.342557559 +0000 UTC m=+238.855249827" observedRunningTime="2026-03-14 09:00:33.888613158 +0000 UTC m=+239.401305436" watchObservedRunningTime="2026-03-14 09:00:33.891866019 +0000 UTC m=+239.404558287" Mar 14 09:00:34 crc kubenswrapper[4956]: I0314 09:00:34.848270 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2x6s" event={"ID":"2667e495-4a15-4aa2-8839-e1b66f2ee380","Type":"ContainerStarted","Data":"57ebc0805c02a5b7a5111cbc5b97b015673bdac005be912f580df0488df2cc63"} Mar 14 09:00:34 crc kubenswrapper[4956]: I0314 09:00:34.851403 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4ztd" event={"ID":"78b22cf8-2118-463f-804e-9890feee4427","Type":"ContainerStarted","Data":"08115a7096065a8cb729b565a5b2929affd72c301c045bd48274eac6230d07a8"} Mar 14 09:00:34 crc kubenswrapper[4956]: I0314 09:00:34.854237 4956 generic.go:334] "Generic (PLEG): container finished" podID="6a3b7192-2792-4295-b25d-a22c476cd174" containerID="74cab11720b09ee8e6e9cf5424bd24f1cc1ba5a3fd23a7edd990291aa8a9372b" exitCode=0 Mar 14 
09:00:34 crc kubenswrapper[4956]: I0314 09:00:34.854290 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xw98" event={"ID":"6a3b7192-2792-4295-b25d-a22c476cd174","Type":"ContainerDied","Data":"74cab11720b09ee8e6e9cf5424bd24f1cc1ba5a3fd23a7edd990291aa8a9372b"} Mar 14 09:00:34 crc kubenswrapper[4956]: I0314 09:00:34.856051 4956 generic.go:334] "Generic (PLEG): container finished" podID="39583bbe-6bbd-4423-b048-61c0dc5d955e" containerID="0e05a0d2b99ed67da7c6b1d11f470c39603220248023aff3e9d0b054a8e33af1" exitCode=0 Mar 14 09:00:34 crc kubenswrapper[4956]: I0314 09:00:34.856091 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kh627" event={"ID":"39583bbe-6bbd-4423-b048-61c0dc5d955e","Type":"ContainerDied","Data":"0e05a0d2b99ed67da7c6b1d11f470c39603220248023aff3e9d0b054a8e33af1"} Mar 14 09:00:34 crc kubenswrapper[4956]: I0314 09:00:34.859172 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2vtm" event={"ID":"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3","Type":"ContainerStarted","Data":"c9fa095b85996840c323172eeb91450b73eb689debc81d59a28df6655744e781"} Mar 14 09:00:34 crc kubenswrapper[4956]: I0314 09:00:34.904869 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j2vtm" podStartSLOduration=1.505321319 podStartE2EDuration="1m0.904854568s" podCreationTimestamp="2026-03-14 08:59:34 +0000 UTC" firstStartedPulling="2026-03-14 08:59:35.169142287 +0000 UTC m=+180.681834555" lastFinishedPulling="2026-03-14 09:00:34.568675536 +0000 UTC m=+240.081367804" observedRunningTime="2026-03-14 09:00:34.903033513 +0000 UTC m=+240.415725791" watchObservedRunningTime="2026-03-14 09:00:34.904854568 +0000 UTC m=+240.417546836" Mar 14 09:00:34 crc kubenswrapper[4956]: I0314 09:00:34.921794 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-f4ztd" podStartSLOduration=1.744131639 podStartE2EDuration="58.92177927s" podCreationTimestamp="2026-03-14 08:59:36 +0000 UTC" firstStartedPulling="2026-03-14 08:59:37.291996718 +0000 UTC m=+182.804688986" lastFinishedPulling="2026-03-14 09:00:34.469644349 +0000 UTC m=+239.982336617" observedRunningTime="2026-03-14 09:00:34.920149349 +0000 UTC m=+240.432841617" watchObservedRunningTime="2026-03-14 09:00:34.92177927 +0000 UTC m=+240.434471538" Mar 14 09:00:35 crc kubenswrapper[4956]: I0314 09:00:35.011279 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dd7hc" Mar 14 09:00:35 crc kubenswrapper[4956]: I0314 09:00:35.011343 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dd7hc" Mar 14 09:00:35 crc kubenswrapper[4956]: I0314 09:00:35.865850 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xw98" event={"ID":"6a3b7192-2792-4295-b25d-a22c476cd174","Type":"ContainerStarted","Data":"2022c86d8eb04405509c2f11141a901012b463aba8b34a9e4e8365477e8f6112"} Mar 14 09:00:35 crc kubenswrapper[4956]: I0314 09:00:35.867988 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kh627" event={"ID":"39583bbe-6bbd-4423-b048-61c0dc5d955e","Type":"ContainerStarted","Data":"a146dca68ff6faee47b870ffd3f52378ae041f0023ee0b429514b75688c03ee2"} Mar 14 09:00:35 crc kubenswrapper[4956]: I0314 09:00:35.869840 4956 generic.go:334] "Generic (PLEG): container finished" podID="2667e495-4a15-4aa2-8839-e1b66f2ee380" containerID="57ebc0805c02a5b7a5111cbc5b97b015673bdac005be912f580df0488df2cc63" exitCode=0 Mar 14 09:00:35 crc kubenswrapper[4956]: I0314 09:00:35.869876 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2x6s" 
event={"ID":"2667e495-4a15-4aa2-8839-e1b66f2ee380","Type":"ContainerDied","Data":"57ebc0805c02a5b7a5111cbc5b97b015673bdac005be912f580df0488df2cc63"} Mar 14 09:00:35 crc kubenswrapper[4956]: I0314 09:00:35.871833 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5glts" event={"ID":"a671bb4b-c176-4930-8b09-c5f1b03e27c9","Type":"ContainerStarted","Data":"39cb070a882089e1b3d0d07b7fd0ceeb38015f84a3ac3c8e465b83e1ff25ed8a"} Mar 14 09:00:35 crc kubenswrapper[4956]: I0314 09:00:35.873672 4956 generic.go:334] "Generic (PLEG): container finished" podID="011b2d6b-88b0-4013-9ded-b9845c02dec0" containerID="2981cf4b0738d584f35aead635d00db25f59d367a2dfe864aa7761d5bedc7a2d" exitCode=0 Mar 14 09:00:35 crc kubenswrapper[4956]: I0314 09:00:35.873961 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n4jp" event={"ID":"011b2d6b-88b0-4013-9ded-b9845c02dec0","Type":"ContainerDied","Data":"2981cf4b0738d584f35aead635d00db25f59d367a2dfe864aa7761d5bedc7a2d"} Mar 14 09:00:35 crc kubenswrapper[4956]: I0314 09:00:35.887040 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8xw98" podStartSLOduration=2.840735928 podStartE2EDuration="1m1.887022939s" podCreationTimestamp="2026-03-14 08:59:34 +0000 UTC" firstStartedPulling="2026-03-14 08:59:36.270791534 +0000 UTC m=+181.783483802" lastFinishedPulling="2026-03-14 09:00:35.317078545 +0000 UTC m=+240.829770813" observedRunningTime="2026-03-14 09:00:35.885419029 +0000 UTC m=+241.398111287" watchObservedRunningTime="2026-03-14 09:00:35.887022939 +0000 UTC m=+241.399715197" Mar 14 09:00:35 crc kubenswrapper[4956]: I0314 09:00:35.930584 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kh627" podStartSLOduration=2.985604436 podStartE2EDuration="59.930567073s" podCreationTimestamp="2026-03-14 08:59:36 +0000 UTC" 
firstStartedPulling="2026-03-14 08:59:38.319902888 +0000 UTC m=+183.832595156" lastFinishedPulling="2026-03-14 09:00:35.264865525 +0000 UTC m=+240.777557793" observedRunningTime="2026-03-14 09:00:35.929586389 +0000 UTC m=+241.442278677" watchObservedRunningTime="2026-03-14 09:00:35.930567073 +0000 UTC m=+241.443259341" Mar 14 09:00:36 crc kubenswrapper[4956]: I0314 09:00:36.188329 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-dd7hc" podUID="038a2b56-42df-4121-b7b4-bdecf2ccb674" containerName="registry-server" probeResult="failure" output=< Mar 14 09:00:36 crc kubenswrapper[4956]: timeout: failed to connect service ":50051" within 1s Mar 14 09:00:36 crc kubenswrapper[4956]: > Mar 14 09:00:36 crc kubenswrapper[4956]: I0314 09:00:36.609795 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f4ztd" Mar 14 09:00:36 crc kubenswrapper[4956]: I0314 09:00:36.609840 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f4ztd" Mar 14 09:00:36 crc kubenswrapper[4956]: I0314 09:00:36.881774 4956 generic.go:334] "Generic (PLEG): container finished" podID="a671bb4b-c176-4930-8b09-c5f1b03e27c9" containerID="39cb070a882089e1b3d0d07b7fd0ceeb38015f84a3ac3c8e465b83e1ff25ed8a" exitCode=0 Mar 14 09:00:36 crc kubenswrapper[4956]: I0314 09:00:36.881820 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5glts" event={"ID":"a671bb4b-c176-4930-8b09-c5f1b03e27c9","Type":"ContainerDied","Data":"39cb070a882089e1b3d0d07b7fd0ceeb38015f84a3ac3c8e465b83e1ff25ed8a"} Mar 14 09:00:37 crc kubenswrapper[4956]: I0314 09:00:37.021992 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kh627" Mar 14 09:00:37 crc kubenswrapper[4956]: I0314 09:00:37.022135 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kh627" Mar 14 09:00:37 crc kubenswrapper[4956]: I0314 09:00:37.649793 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-f4ztd" podUID="78b22cf8-2118-463f-804e-9890feee4427" containerName="registry-server" probeResult="failure" output=< Mar 14 09:00:37 crc kubenswrapper[4956]: timeout: failed to connect service ":50051" within 1s Mar 14 09:00:37 crc kubenswrapper[4956]: > Mar 14 09:00:38 crc kubenswrapper[4956]: I0314 09:00:38.072897 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-kh627" podUID="39583bbe-6bbd-4423-b048-61c0dc5d955e" containerName="registry-server" probeResult="failure" output=< Mar 14 09:00:38 crc kubenswrapper[4956]: timeout: failed to connect service ":50051" within 1s Mar 14 09:00:38 crc kubenswrapper[4956]: > Mar 14 09:00:38 crc kubenswrapper[4956]: I0314 09:00:38.894064 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2x6s" event={"ID":"2667e495-4a15-4aa2-8839-e1b66f2ee380","Type":"ContainerStarted","Data":"e48dd7440dc630c42703dac0e9aa62de2970a6e1a7275f3cfffd29dc2410bd47"} Mar 14 09:00:39 crc kubenswrapper[4956]: I0314 09:00:39.900105 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5glts" event={"ID":"a671bb4b-c176-4930-8b09-c5f1b03e27c9","Type":"ContainerStarted","Data":"d89a56976a0d23ac0472ec457ea4438d234bde87645de809a60678b1a3742e05"} Mar 14 09:00:39 crc kubenswrapper[4956]: I0314 09:00:39.902938 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n4jp" event={"ID":"011b2d6b-88b0-4013-9ded-b9845c02dec0","Type":"ContainerStarted","Data":"f54e0175f196e9febf42e9d55997426a693c5b6f9ee8198d998a2a38bc73527c"} Mar 14 09:00:39 crc kubenswrapper[4956]: I0314 09:00:39.917407 4956 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-operators-5glts" podStartSLOduration=3.71230334 podStartE2EDuration="1m2.917391076s" podCreationTimestamp="2026-03-14 08:59:37 +0000 UTC" firstStartedPulling="2026-03-14 08:59:39.42015132 +0000 UTC m=+184.932843588" lastFinishedPulling="2026-03-14 09:00:38.625239046 +0000 UTC m=+244.137931324" observedRunningTime="2026-03-14 09:00:39.915883719 +0000 UTC m=+245.428575987" watchObservedRunningTime="2026-03-14 09:00:39.917391076 +0000 UTC m=+245.430083334" Mar 14 09:00:39 crc kubenswrapper[4956]: I0314 09:00:39.935316 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n2x6s" podStartSLOduration=3.800155499 podStartE2EDuration="1m2.935299512s" podCreationTimestamp="2026-03-14 08:59:37 +0000 UTC" firstStartedPulling="2026-03-14 08:59:39.375658422 +0000 UTC m=+184.888350690" lastFinishedPulling="2026-03-14 09:00:38.510802435 +0000 UTC m=+244.023494703" observedRunningTime="2026-03-14 09:00:39.932373749 +0000 UTC m=+245.445066017" watchObservedRunningTime="2026-03-14 09:00:39.935299512 +0000 UTC m=+245.447991780" Mar 14 09:00:40 crc kubenswrapper[4956]: I0314 09:00:40.925689 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9n4jp" podStartSLOduration=4.581378289 podStartE2EDuration="1m6.925670098s" podCreationTimestamp="2026-03-14 08:59:34 +0000 UTC" firstStartedPulling="2026-03-14 08:59:36.239791312 +0000 UTC m=+181.752483580" lastFinishedPulling="2026-03-14 09:00:38.584083121 +0000 UTC m=+244.096775389" observedRunningTime="2026-03-14 09:00:40.924649743 +0000 UTC m=+246.437342031" watchObservedRunningTime="2026-03-14 09:00:40.925670098 +0000 UTC m=+246.438362366" Mar 14 09:00:44 crc kubenswrapper[4956]: I0314 09:00:44.417439 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j2vtm" Mar 14 09:00:44 crc kubenswrapper[4956]: I0314 
09:00:44.417752 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j2vtm" Mar 14 09:00:44 crc kubenswrapper[4956]: I0314 09:00:44.618073 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j2vtm" Mar 14 09:00:44 crc kubenswrapper[4956]: I0314 09:00:44.640236 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8xw98" Mar 14 09:00:44 crc kubenswrapper[4956]: I0314 09:00:44.640300 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8xw98" Mar 14 09:00:44 crc kubenswrapper[4956]: I0314 09:00:44.673678 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8xw98" Mar 14 09:00:44 crc kubenswrapper[4956]: I0314 09:00:44.861356 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9n4jp" Mar 14 09:00:44 crc kubenswrapper[4956]: I0314 09:00:44.861871 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9n4jp" Mar 14 09:00:44 crc kubenswrapper[4956]: I0314 09:00:44.896685 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9n4jp" Mar 14 09:00:44 crc kubenswrapper[4956]: I0314 09:00:44.964202 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8xw98" Mar 14 09:00:44 crc kubenswrapper[4956]: I0314 09:00:44.973017 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9n4jp" Mar 14 09:00:44 crc kubenswrapper[4956]: I0314 09:00:44.980102 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-j2vtm" Mar 14 09:00:45 crc kubenswrapper[4956]: I0314 09:00:45.064713 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dd7hc" Mar 14 09:00:45 crc kubenswrapper[4956]: I0314 09:00:45.109583 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dd7hc" Mar 14 09:00:45 crc kubenswrapper[4956]: I0314 09:00:45.844235 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dd7hc"] Mar 14 09:00:46 crc kubenswrapper[4956]: I0314 09:00:46.652596 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f4ztd" Mar 14 09:00:46 crc kubenswrapper[4956]: I0314 09:00:46.699789 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f4ztd" Mar 14 09:00:46 crc kubenswrapper[4956]: I0314 09:00:46.938571 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dd7hc" podUID="038a2b56-42df-4121-b7b4-bdecf2ccb674" containerName="registry-server" containerID="cri-o://1fc9aa2d379aa19e85de6c66a2661c0971237cb32bbc94e32687d309e09798da" gracePeriod=2 Mar 14 09:00:47 crc kubenswrapper[4956]: I0314 09:00:47.060901 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kh627" Mar 14 09:00:47 crc kubenswrapper[4956]: I0314 09:00:47.101036 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kh627" Mar 14 09:00:47 crc kubenswrapper[4956]: I0314 09:00:47.243617 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9n4jp"] Mar 14 09:00:47 crc kubenswrapper[4956]: I0314 09:00:47.573812 4956 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m4xmw"] Mar 14 09:00:47 crc kubenswrapper[4956]: I0314 09:00:47.620564 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n2x6s" Mar 14 09:00:47 crc kubenswrapper[4956]: I0314 09:00:47.620930 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n2x6s" Mar 14 09:00:47 crc kubenswrapper[4956]: I0314 09:00:47.673127 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n2x6s" Mar 14 09:00:47 crc kubenswrapper[4956]: I0314 09:00:47.946320 4956 generic.go:334] "Generic (PLEG): container finished" podID="038a2b56-42df-4121-b7b4-bdecf2ccb674" containerID="1fc9aa2d379aa19e85de6c66a2661c0971237cb32bbc94e32687d309e09798da" exitCode=0 Mar 14 09:00:47 crc kubenswrapper[4956]: I0314 09:00:47.946380 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7hc" event={"ID":"038a2b56-42df-4121-b7b4-bdecf2ccb674","Type":"ContainerDied","Data":"1fc9aa2d379aa19e85de6c66a2661c0971237cb32bbc94e32687d309e09798da"} Mar 14 09:00:47 crc kubenswrapper[4956]: I0314 09:00:47.946835 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9n4jp" podUID="011b2d6b-88b0-4013-9ded-b9845c02dec0" containerName="registry-server" containerID="cri-o://f54e0175f196e9febf42e9d55997426a693c5b6f9ee8198d998a2a38bc73527c" gracePeriod=2 Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.001915 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n2x6s" Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.039297 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5glts" Mar 14 09:00:48 
crc kubenswrapper[4956]: I0314 09:00:48.039338 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5glts" Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.052757 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dd7hc" Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.089507 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5glts" Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.138560 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vn45\" (UniqueName: \"kubernetes.io/projected/038a2b56-42df-4121-b7b4-bdecf2ccb674-kube-api-access-2vn45\") pod \"038a2b56-42df-4121-b7b4-bdecf2ccb674\" (UID: \"038a2b56-42df-4121-b7b4-bdecf2ccb674\") " Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.138619 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038a2b56-42df-4121-b7b4-bdecf2ccb674-utilities\") pod \"038a2b56-42df-4121-b7b4-bdecf2ccb674\" (UID: \"038a2b56-42df-4121-b7b4-bdecf2ccb674\") " Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.138665 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038a2b56-42df-4121-b7b4-bdecf2ccb674-catalog-content\") pod \"038a2b56-42df-4121-b7b4-bdecf2ccb674\" (UID: \"038a2b56-42df-4121-b7b4-bdecf2ccb674\") " Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.139629 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/038a2b56-42df-4121-b7b4-bdecf2ccb674-utilities" (OuterVolumeSpecName: "utilities") pod "038a2b56-42df-4121-b7b4-bdecf2ccb674" (UID: "038a2b56-42df-4121-b7b4-bdecf2ccb674"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.143628 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/038a2b56-42df-4121-b7b4-bdecf2ccb674-kube-api-access-2vn45" (OuterVolumeSpecName: "kube-api-access-2vn45") pod "038a2b56-42df-4121-b7b4-bdecf2ccb674" (UID: "038a2b56-42df-4121-b7b4-bdecf2ccb674"). InnerVolumeSpecName "kube-api-access-2vn45". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.188351 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/038a2b56-42df-4121-b7b4-bdecf2ccb674-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "038a2b56-42df-4121-b7b4-bdecf2ccb674" (UID: "038a2b56-42df-4121-b7b4-bdecf2ccb674"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.239739 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vn45\" (UniqueName: \"kubernetes.io/projected/038a2b56-42df-4121-b7b4-bdecf2ccb674-kube-api-access-2vn45\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.239764 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038a2b56-42df-4121-b7b4-bdecf2ccb674-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.239773 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038a2b56-42df-4121-b7b4-bdecf2ccb674-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.955134 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7hc" 
event={"ID":"038a2b56-42df-4121-b7b4-bdecf2ccb674","Type":"ContainerDied","Data":"72851f00ba1dca78d783c02d7f56dfe09e5b4f4ec000ff48810eef4659bb5bec"} Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.955167 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dd7hc" Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.955187 4956 scope.go:117] "RemoveContainer" containerID="1fc9aa2d379aa19e85de6c66a2661c0971237cb32bbc94e32687d309e09798da" Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.989666 4956 scope.go:117] "RemoveContainer" containerID="9833e56c303d50e7b34800407218700e070ce0251a60b4f2d6b06144b8c38fa1" Mar 14 09:00:48 crc kubenswrapper[4956]: I0314 09:00:48.994065 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dd7hc"] Mar 14 09:00:49 crc kubenswrapper[4956]: I0314 09:00:49.000638 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dd7hc"] Mar 14 09:00:49 crc kubenswrapper[4956]: I0314 09:00:49.011235 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5glts" Mar 14 09:00:49 crc kubenswrapper[4956]: I0314 09:00:49.030999 4956 scope.go:117] "RemoveContainer" containerID="e85151e03dcebf0e5ca1b063d705ab77ff8d85ae599eb02c834176041690eb6e" Mar 14 09:00:49 crc kubenswrapper[4956]: I0314 09:00:49.217430 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="038a2b56-42df-4121-b7b4-bdecf2ccb674" path="/var/lib/kubelet/pods/038a2b56-42df-4121-b7b4-bdecf2ccb674/volumes" Mar 14 09:00:49 crc kubenswrapper[4956]: I0314 09:00:49.647662 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kh627"] Mar 14 09:00:49 crc kubenswrapper[4956]: I0314 09:00:49.648199 4956 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-kh627" podUID="39583bbe-6bbd-4423-b048-61c0dc5d955e" containerName="registry-server" containerID="cri-o://a146dca68ff6faee47b870ffd3f52378ae041f0023ee0b429514b75688c03ee2" gracePeriod=2 Mar 14 09:00:49 crc kubenswrapper[4956]: I0314 09:00:49.962164 4956 generic.go:334] "Generic (PLEG): container finished" podID="011b2d6b-88b0-4013-9ded-b9845c02dec0" containerID="f54e0175f196e9febf42e9d55997426a693c5b6f9ee8198d998a2a38bc73527c" exitCode=0 Mar 14 09:00:49 crc kubenswrapper[4956]: I0314 09:00:49.962230 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n4jp" event={"ID":"011b2d6b-88b0-4013-9ded-b9845c02dec0","Type":"ContainerDied","Data":"f54e0175f196e9febf42e9d55997426a693c5b6f9ee8198d998a2a38bc73527c"} Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.341796 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9n4jp" Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.366148 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/011b2d6b-88b0-4013-9ded-b9845c02dec0-catalog-content\") pod \"011b2d6b-88b0-4013-9ded-b9845c02dec0\" (UID: \"011b2d6b-88b0-4013-9ded-b9845c02dec0\") " Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.366239 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pvc4\" (UniqueName: \"kubernetes.io/projected/011b2d6b-88b0-4013-9ded-b9845c02dec0-kube-api-access-2pvc4\") pod \"011b2d6b-88b0-4013-9ded-b9845c02dec0\" (UID: \"011b2d6b-88b0-4013-9ded-b9845c02dec0\") " Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.366267 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/011b2d6b-88b0-4013-9ded-b9845c02dec0-utilities\") pod 
\"011b2d6b-88b0-4013-9ded-b9845c02dec0\" (UID: \"011b2d6b-88b0-4013-9ded-b9845c02dec0\") " Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.367204 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/011b2d6b-88b0-4013-9ded-b9845c02dec0-utilities" (OuterVolumeSpecName: "utilities") pod "011b2d6b-88b0-4013-9ded-b9845c02dec0" (UID: "011b2d6b-88b0-4013-9ded-b9845c02dec0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.371678 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/011b2d6b-88b0-4013-9ded-b9845c02dec0-kube-api-access-2pvc4" (OuterVolumeSpecName: "kube-api-access-2pvc4") pod "011b2d6b-88b0-4013-9ded-b9845c02dec0" (UID: "011b2d6b-88b0-4013-9ded-b9845c02dec0"). InnerVolumeSpecName "kube-api-access-2pvc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.407686 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/011b2d6b-88b0-4013-9ded-b9845c02dec0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "011b2d6b-88b0-4013-9ded-b9845c02dec0" (UID: "011b2d6b-88b0-4013-9ded-b9845c02dec0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.468268 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/011b2d6b-88b0-4013-9ded-b9845c02dec0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.468306 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pvc4\" (UniqueName: \"kubernetes.io/projected/011b2d6b-88b0-4013-9ded-b9845c02dec0-kube-api-access-2pvc4\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.468319 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/011b2d6b-88b0-4013-9ded-b9845c02dec0-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.864190 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kh627" Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.970418 4956 generic.go:334] "Generic (PLEG): container finished" podID="39583bbe-6bbd-4423-b048-61c0dc5d955e" containerID="a146dca68ff6faee47b870ffd3f52378ae041f0023ee0b429514b75688c03ee2" exitCode=0 Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.970495 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kh627" event={"ID":"39583bbe-6bbd-4423-b048-61c0dc5d955e","Type":"ContainerDied","Data":"a146dca68ff6faee47b870ffd3f52378ae041f0023ee0b429514b75688c03ee2"} Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.970515 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kh627" Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.970536 4956 scope.go:117] "RemoveContainer" containerID="a146dca68ff6faee47b870ffd3f52378ae041f0023ee0b429514b75688c03ee2" Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.970524 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kh627" event={"ID":"39583bbe-6bbd-4423-b048-61c0dc5d955e","Type":"ContainerDied","Data":"38451df00baac777b2a92f8e4e3ebf38e4ae42eb09109bb98aa92acf93f169f4"} Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.972161 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2lpp\" (UniqueName: \"kubernetes.io/projected/39583bbe-6bbd-4423-b048-61c0dc5d955e-kube-api-access-m2lpp\") pod \"39583bbe-6bbd-4423-b048-61c0dc5d955e\" (UID: \"39583bbe-6bbd-4423-b048-61c0dc5d955e\") " Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.972223 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39583bbe-6bbd-4423-b048-61c0dc5d955e-catalog-content\") pod \"39583bbe-6bbd-4423-b048-61c0dc5d955e\" (UID: \"39583bbe-6bbd-4423-b048-61c0dc5d955e\") " Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.972251 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39583bbe-6bbd-4423-b048-61c0dc5d955e-utilities\") pod \"39583bbe-6bbd-4423-b048-61c0dc5d955e\" (UID: \"39583bbe-6bbd-4423-b048-61c0dc5d955e\") " Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.972928 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39583bbe-6bbd-4423-b048-61c0dc5d955e-utilities" (OuterVolumeSpecName: "utilities") pod "39583bbe-6bbd-4423-b048-61c0dc5d955e" (UID: "39583bbe-6bbd-4423-b048-61c0dc5d955e"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.977251 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n4jp" event={"ID":"011b2d6b-88b0-4013-9ded-b9845c02dec0","Type":"ContainerDied","Data":"fff431752c2b9f428459ec069526de84d5c47a807741fcf356b08764213e0ca2"} Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.977352 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9n4jp" Mar 14 09:00:50 crc kubenswrapper[4956]: I0314 09:00:50.978934 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39583bbe-6bbd-4423-b048-61c0dc5d955e-kube-api-access-m2lpp" (OuterVolumeSpecName: "kube-api-access-m2lpp") pod "39583bbe-6bbd-4423-b048-61c0dc5d955e" (UID: "39583bbe-6bbd-4423-b048-61c0dc5d955e"). InnerVolumeSpecName "kube-api-access-m2lpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.004109 4956 scope.go:117] "RemoveContainer" containerID="0e05a0d2b99ed67da7c6b1d11f470c39603220248023aff3e9d0b054a8e33af1" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.006210 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39583bbe-6bbd-4423-b048-61c0dc5d955e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39583bbe-6bbd-4423-b048-61c0dc5d955e" (UID: "39583bbe-6bbd-4423-b048-61c0dc5d955e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.006990 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9n4jp"] Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.011179 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9n4jp"] Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.029421 4956 scope.go:117] "RemoveContainer" containerID="d2fcd66ad8704ecee304f510900f7cf3edf84e3fe5dfbf51ea23980e82f793b2" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.043594 4956 scope.go:117] "RemoveContainer" containerID="a146dca68ff6faee47b870ffd3f52378ae041f0023ee0b429514b75688c03ee2" Mar 14 09:00:51 crc kubenswrapper[4956]: E0314 09:00:51.044051 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a146dca68ff6faee47b870ffd3f52378ae041f0023ee0b429514b75688c03ee2\": container with ID starting with a146dca68ff6faee47b870ffd3f52378ae041f0023ee0b429514b75688c03ee2 not found: ID does not exist" containerID="a146dca68ff6faee47b870ffd3f52378ae041f0023ee0b429514b75688c03ee2" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.044090 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a146dca68ff6faee47b870ffd3f52378ae041f0023ee0b429514b75688c03ee2"} err="failed to get container status \"a146dca68ff6faee47b870ffd3f52378ae041f0023ee0b429514b75688c03ee2\": rpc error: code = NotFound desc = could not find container \"a146dca68ff6faee47b870ffd3f52378ae041f0023ee0b429514b75688c03ee2\": container with ID starting with a146dca68ff6faee47b870ffd3f52378ae041f0023ee0b429514b75688c03ee2 not found: ID does not exist" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.044115 4956 scope.go:117] "RemoveContainer" containerID="0e05a0d2b99ed67da7c6b1d11f470c39603220248023aff3e9d0b054a8e33af1" 
Mar 14 09:00:51 crc kubenswrapper[4956]: E0314 09:00:51.044399 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e05a0d2b99ed67da7c6b1d11f470c39603220248023aff3e9d0b054a8e33af1\": container with ID starting with 0e05a0d2b99ed67da7c6b1d11f470c39603220248023aff3e9d0b054a8e33af1 not found: ID does not exist" containerID="0e05a0d2b99ed67da7c6b1d11f470c39603220248023aff3e9d0b054a8e33af1" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.044431 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e05a0d2b99ed67da7c6b1d11f470c39603220248023aff3e9d0b054a8e33af1"} err="failed to get container status \"0e05a0d2b99ed67da7c6b1d11f470c39603220248023aff3e9d0b054a8e33af1\": rpc error: code = NotFound desc = could not find container \"0e05a0d2b99ed67da7c6b1d11f470c39603220248023aff3e9d0b054a8e33af1\": container with ID starting with 0e05a0d2b99ed67da7c6b1d11f470c39603220248023aff3e9d0b054a8e33af1 not found: ID does not exist" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.044448 4956 scope.go:117] "RemoveContainer" containerID="d2fcd66ad8704ecee304f510900f7cf3edf84e3fe5dfbf51ea23980e82f793b2" Mar 14 09:00:51 crc kubenswrapper[4956]: E0314 09:00:51.044672 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2fcd66ad8704ecee304f510900f7cf3edf84e3fe5dfbf51ea23980e82f793b2\": container with ID starting with d2fcd66ad8704ecee304f510900f7cf3edf84e3fe5dfbf51ea23980e82f793b2 not found: ID does not exist" containerID="d2fcd66ad8704ecee304f510900f7cf3edf84e3fe5dfbf51ea23980e82f793b2" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.044696 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2fcd66ad8704ecee304f510900f7cf3edf84e3fe5dfbf51ea23980e82f793b2"} err="failed to get container status 
\"d2fcd66ad8704ecee304f510900f7cf3edf84e3fe5dfbf51ea23980e82f793b2\": rpc error: code = NotFound desc = could not find container \"d2fcd66ad8704ecee304f510900f7cf3edf84e3fe5dfbf51ea23980e82f793b2\": container with ID starting with d2fcd66ad8704ecee304f510900f7cf3edf84e3fe5dfbf51ea23980e82f793b2 not found: ID does not exist" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.044712 4956 scope.go:117] "RemoveContainer" containerID="f54e0175f196e9febf42e9d55997426a693c5b6f9ee8198d998a2a38bc73527c" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.063149 4956 scope.go:117] "RemoveContainer" containerID="2981cf4b0738d584f35aead635d00db25f59d367a2dfe864aa7761d5bedc7a2d" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.074342 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2lpp\" (UniqueName: \"kubernetes.io/projected/39583bbe-6bbd-4423-b048-61c0dc5d955e-kube-api-access-m2lpp\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.074376 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39583bbe-6bbd-4423-b048-61c0dc5d955e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.074389 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39583bbe-6bbd-4423-b048-61c0dc5d955e-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.077268 4956 scope.go:117] "RemoveContainer" containerID="50ce33726c5740459dfa5300daaa5fac0b357405936761ffa945bcb02b2896dd" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.089273 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c77f5689b-db2pz"] Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.089500 4956 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" podUID="716b429f-386c-4ef4-9951-500bb511dc6b" containerName="controller-manager" containerID="cri-o://898479962cdffb7972905ae9374c42a2f0ca7338105664294eae84355c4f5774" gracePeriod=30 Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.189442 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk"] Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.189708 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" podUID="e3c775cf-b7e3-4171-83f0-49b6f77a5e51" containerName="route-controller-manager" containerID="cri-o://8019b6175bcc0d29415d1bf6a5ab797125e235525c4c6b07c4b63e52209957ee" gracePeriod=30 Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.215798 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="011b2d6b-88b0-4013-9ded-b9845c02dec0" path="/var/lib/kubelet/pods/011b2d6b-88b0-4013-9ded-b9845c02dec0/volumes" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.324546 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kh627"] Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.328423 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kh627"] Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.598121 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.603339 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.680465 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jrht\" (UniqueName: \"kubernetes.io/projected/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-kube-api-access-5jrht\") pod \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\" (UID: \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\") " Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.680563 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-client-ca\") pod \"716b429f-386c-4ef4-9951-500bb511dc6b\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.680584 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-config\") pod \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\" (UID: \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\") " Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.680601 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/716b429f-386c-4ef4-9951-500bb511dc6b-serving-cert\") pod \"716b429f-386c-4ef4-9951-500bb511dc6b\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.680626 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-proxy-ca-bundles\") pod \"716b429f-386c-4ef4-9951-500bb511dc6b\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.680647 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-client-ca\") pod \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\" (UID: \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\") " Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.680667 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-serving-cert\") pod \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\" (UID: \"e3c775cf-b7e3-4171-83f0-49b6f77a5e51\") " Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.680687 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-config\") pod \"716b429f-386c-4ef4-9951-500bb511dc6b\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.680706 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qf62\" (UniqueName: \"kubernetes.io/projected/716b429f-386c-4ef4-9951-500bb511dc6b-kube-api-access-2qf62\") pod \"716b429f-386c-4ef4-9951-500bb511dc6b\" (UID: \"716b429f-386c-4ef4-9951-500bb511dc6b\") " Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.681579 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-client-ca" (OuterVolumeSpecName: "client-ca") pod "e3c775cf-b7e3-4171-83f0-49b6f77a5e51" (UID: "e3c775cf-b7e3-4171-83f0-49b6f77a5e51"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.681630 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "716b429f-386c-4ef4-9951-500bb511dc6b" (UID: "716b429f-386c-4ef4-9951-500bb511dc6b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.681845 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-client-ca" (OuterVolumeSpecName: "client-ca") pod "716b429f-386c-4ef4-9951-500bb511dc6b" (UID: "716b429f-386c-4ef4-9951-500bb511dc6b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.682019 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-config" (OuterVolumeSpecName: "config") pod "e3c775cf-b7e3-4171-83f0-49b6f77a5e51" (UID: "e3c775cf-b7e3-4171-83f0-49b6f77a5e51"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.682068 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-config" (OuterVolumeSpecName: "config") pod "716b429f-386c-4ef4-9951-500bb511dc6b" (UID: "716b429f-386c-4ef4-9951-500bb511dc6b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.683927 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e3c775cf-b7e3-4171-83f0-49b6f77a5e51" (UID: "e3c775cf-b7e3-4171-83f0-49b6f77a5e51"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.683947 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-kube-api-access-5jrht" (OuterVolumeSpecName: "kube-api-access-5jrht") pod "e3c775cf-b7e3-4171-83f0-49b6f77a5e51" (UID: "e3c775cf-b7e3-4171-83f0-49b6f77a5e51"). InnerVolumeSpecName "kube-api-access-5jrht". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.683976 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/716b429f-386c-4ef4-9951-500bb511dc6b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "716b429f-386c-4ef4-9951-500bb511dc6b" (UID: "716b429f-386c-4ef4-9951-500bb511dc6b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.684309 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/716b429f-386c-4ef4-9951-500bb511dc6b-kube-api-access-2qf62" (OuterVolumeSpecName: "kube-api-access-2qf62") pod "716b429f-386c-4ef4-9951-500bb511dc6b" (UID: "716b429f-386c-4ef4-9951-500bb511dc6b"). InnerVolumeSpecName "kube-api-access-2qf62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.781886 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jrht\" (UniqueName: \"kubernetes.io/projected/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-kube-api-access-5jrht\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.781918 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.781930 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.781939 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/716b429f-386c-4ef4-9951-500bb511dc6b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.781948 4956 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.781956 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.781964 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3c775cf-b7e3-4171-83f0-49b6f77a5e51-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.781972 4956 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/716b429f-386c-4ef4-9951-500bb511dc6b-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.781980 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qf62\" (UniqueName: \"kubernetes.io/projected/716b429f-386c-4ef4-9951-500bb511dc6b-kube-api-access-2qf62\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.989573 4956 generic.go:334] "Generic (PLEG): container finished" podID="716b429f-386c-4ef4-9951-500bb511dc6b" containerID="898479962cdffb7972905ae9374c42a2f0ca7338105664294eae84355c4f5774" exitCode=0 Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.989616 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" event={"ID":"716b429f-386c-4ef4-9951-500bb511dc6b","Type":"ContainerDied","Data":"898479962cdffb7972905ae9374c42a2f0ca7338105664294eae84355c4f5774"} Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.989705 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" event={"ID":"716b429f-386c-4ef4-9951-500bb511dc6b","Type":"ContainerDied","Data":"0dd65e65152c7349ef64d150c561d4401821afef403bb4a870679306048670d8"} Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.989723 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c77f5689b-db2pz" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.989746 4956 scope.go:117] "RemoveContainer" containerID="898479962cdffb7972905ae9374c42a2f0ca7338105664294eae84355c4f5774" Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.992261 4956 generic.go:334] "Generic (PLEG): container finished" podID="e3c775cf-b7e3-4171-83f0-49b6f77a5e51" containerID="8019b6175bcc0d29415d1bf6a5ab797125e235525c4c6b07c4b63e52209957ee" exitCode=0 Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.992301 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" event={"ID":"e3c775cf-b7e3-4171-83f0-49b6f77a5e51","Type":"ContainerDied","Data":"8019b6175bcc0d29415d1bf6a5ab797125e235525c4c6b07c4b63e52209957ee"} Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.992331 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" event={"ID":"e3c775cf-b7e3-4171-83f0-49b6f77a5e51","Type":"ContainerDied","Data":"27ceb125a211061d3a0332bffbfd5a97621182d24154c4a0365151dc83a6ed5b"} Mar 14 09:00:51 crc kubenswrapper[4956]: I0314 09:00:51.992878 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk" Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.012267 4956 scope.go:117] "RemoveContainer" containerID="898479962cdffb7972905ae9374c42a2f0ca7338105664294eae84355c4f5774" Mar 14 09:00:52 crc kubenswrapper[4956]: E0314 09:00:52.013057 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"898479962cdffb7972905ae9374c42a2f0ca7338105664294eae84355c4f5774\": container with ID starting with 898479962cdffb7972905ae9374c42a2f0ca7338105664294eae84355c4f5774 not found: ID does not exist" containerID="898479962cdffb7972905ae9374c42a2f0ca7338105664294eae84355c4f5774" Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.013151 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"898479962cdffb7972905ae9374c42a2f0ca7338105664294eae84355c4f5774"} err="failed to get container status \"898479962cdffb7972905ae9374c42a2f0ca7338105664294eae84355c4f5774\": rpc error: code = NotFound desc = could not find container \"898479962cdffb7972905ae9374c42a2f0ca7338105664294eae84355c4f5774\": container with ID starting with 898479962cdffb7972905ae9374c42a2f0ca7338105664294eae84355c4f5774 not found: ID does not exist" Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.013205 4956 scope.go:117] "RemoveContainer" containerID="8019b6175bcc0d29415d1bf6a5ab797125e235525c4c6b07c4b63e52209957ee" Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.024764 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c77f5689b-db2pz"] Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.031780 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-c77f5689b-db2pz"] Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.033799 4956 scope.go:117] 
"RemoveContainer" containerID="8019b6175bcc0d29415d1bf6a5ab797125e235525c4c6b07c4b63e52209957ee" Mar 14 09:00:52 crc kubenswrapper[4956]: E0314 09:00:52.034345 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8019b6175bcc0d29415d1bf6a5ab797125e235525c4c6b07c4b63e52209957ee\": container with ID starting with 8019b6175bcc0d29415d1bf6a5ab797125e235525c4c6b07c4b63e52209957ee not found: ID does not exist" containerID="8019b6175bcc0d29415d1bf6a5ab797125e235525c4c6b07c4b63e52209957ee" Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.034368 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8019b6175bcc0d29415d1bf6a5ab797125e235525c4c6b07c4b63e52209957ee"} err="failed to get container status \"8019b6175bcc0d29415d1bf6a5ab797125e235525c4c6b07c4b63e52209957ee\": rpc error: code = NotFound desc = could not find container \"8019b6175bcc0d29415d1bf6a5ab797125e235525c4c6b07c4b63e52209957ee\": container with ID starting with 8019b6175bcc0d29415d1bf6a5ab797125e235525c4c6b07c4b63e52209957ee not found: ID does not exist" Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.036189 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk"] Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.043376 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c847c5c5-q6fjk"] Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.047721 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5glts"] Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.048196 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5glts" podUID="a671bb4b-c176-4930-8b09-c5f1b03e27c9" containerName="registry-server" 
containerID="cri-o://d89a56976a0d23ac0472ec457ea4438d234bde87645de809a60678b1a3742e05" gracePeriod=2 Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.377814 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5glts" Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.492598 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a671bb4b-c176-4930-8b09-c5f1b03e27c9-catalog-content\") pod \"a671bb4b-c176-4930-8b09-c5f1b03e27c9\" (UID: \"a671bb4b-c176-4930-8b09-c5f1b03e27c9\") " Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.492661 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2brbh\" (UniqueName: \"kubernetes.io/projected/a671bb4b-c176-4930-8b09-c5f1b03e27c9-kube-api-access-2brbh\") pod \"a671bb4b-c176-4930-8b09-c5f1b03e27c9\" (UID: \"a671bb4b-c176-4930-8b09-c5f1b03e27c9\") " Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.492772 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a671bb4b-c176-4930-8b09-c5f1b03e27c9-utilities\") pod \"a671bb4b-c176-4930-8b09-c5f1b03e27c9\" (UID: \"a671bb4b-c176-4930-8b09-c5f1b03e27c9\") " Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.494078 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a671bb4b-c176-4930-8b09-c5f1b03e27c9-utilities" (OuterVolumeSpecName: "utilities") pod "a671bb4b-c176-4930-8b09-c5f1b03e27c9" (UID: "a671bb4b-c176-4930-8b09-c5f1b03e27c9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.498031 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a671bb4b-c176-4930-8b09-c5f1b03e27c9-kube-api-access-2brbh" (OuterVolumeSpecName: "kube-api-access-2brbh") pod "a671bb4b-c176-4930-8b09-c5f1b03e27c9" (UID: "a671bb4b-c176-4930-8b09-c5f1b03e27c9"). InnerVolumeSpecName "kube-api-access-2brbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.594785 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a671bb4b-c176-4930-8b09-c5f1b03e27c9-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.594830 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2brbh\" (UniqueName: \"kubernetes.io/projected/a671bb4b-c176-4930-8b09-c5f1b03e27c9-kube-api-access-2brbh\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.611163 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a671bb4b-c176-4930-8b09-c5f1b03e27c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a671bb4b-c176-4930-8b09-c5f1b03e27c9" (UID: "a671bb4b-c176-4930-8b09-c5f1b03e27c9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:00:52 crc kubenswrapper[4956]: I0314 09:00:52.696295 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a671bb4b-c176-4930-8b09-c5f1b03e27c9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.003198 4956 generic.go:334] "Generic (PLEG): container finished" podID="a671bb4b-c176-4930-8b09-c5f1b03e27c9" containerID="d89a56976a0d23ac0472ec457ea4438d234bde87645de809a60678b1a3742e05" exitCode=0 Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.003325 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5glts" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.003344 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5glts" event={"ID":"a671bb4b-c176-4930-8b09-c5f1b03e27c9","Type":"ContainerDied","Data":"d89a56976a0d23ac0472ec457ea4438d234bde87645de809a60678b1a3742e05"} Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.003453 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5glts" event={"ID":"a671bb4b-c176-4930-8b09-c5f1b03e27c9","Type":"ContainerDied","Data":"7830e7fe174afb0b88753dd348c3242046e4cbfa7c3edb93e277a4e0bda530be"} Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.003533 4956 scope.go:117] "RemoveContainer" containerID="d89a56976a0d23ac0472ec457ea4438d234bde87645de809a60678b1a3742e05" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.033454 4956 scope.go:117] "RemoveContainer" containerID="39cb070a882089e1b3d0d07b7fd0ceeb38015f84a3ac3c8e465b83e1ff25ed8a" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.040214 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"] Mar 14 09:00:53 crc 
kubenswrapper[4956]: E0314 09:00:53.040615 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feea3874-1da5-4b39-b76d-06eea186b678" containerName="pruner" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.040631 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="feea3874-1da5-4b39-b76d-06eea186b678" containerName="pruner" Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 09:00:53.040644 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39583bbe-6bbd-4423-b048-61c0dc5d955e" containerName="registry-server" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.040653 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="39583bbe-6bbd-4423-b048-61c0dc5d955e" containerName="registry-server" Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 09:00:53.040667 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39583bbe-6bbd-4423-b048-61c0dc5d955e" containerName="extract-content" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.040676 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="39583bbe-6bbd-4423-b048-61c0dc5d955e" containerName="extract-content" Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 09:00:53.040688 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011b2d6b-88b0-4013-9ded-b9845c02dec0" containerName="extract-utilities" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.040699 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="011b2d6b-88b0-4013-9ded-b9845c02dec0" containerName="extract-utilities" Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 09:00:53.040714 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011b2d6b-88b0-4013-9ded-b9845c02dec0" containerName="extract-content" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.040723 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="011b2d6b-88b0-4013-9ded-b9845c02dec0" containerName="extract-content" Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 
09:00:53.040737 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c" containerName="oc" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.040745 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c" containerName="oc" Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 09:00:53.040758 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a671bb4b-c176-4930-8b09-c5f1b03e27c9" containerName="extract-content" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.040765 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a671bb4b-c176-4930-8b09-c5f1b03e27c9" containerName="extract-content" Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 09:00:53.040777 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038a2b56-42df-4121-b7b4-bdecf2ccb674" containerName="registry-server" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.040787 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="038a2b56-42df-4121-b7b4-bdecf2ccb674" containerName="registry-server" Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 09:00:53.040799 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011b2d6b-88b0-4013-9ded-b9845c02dec0" containerName="registry-server" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.040807 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="011b2d6b-88b0-4013-9ded-b9845c02dec0" containerName="registry-server" Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 09:00:53.040820 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="716b429f-386c-4ef4-9951-500bb511dc6b" containerName="controller-manager" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.040828 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="716b429f-386c-4ef4-9951-500bb511dc6b" containerName="controller-manager" Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 09:00:53.040840 4956 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038a2b56-42df-4121-b7b4-bdecf2ccb674" containerName="extract-content" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.040848 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="038a2b56-42df-4121-b7b4-bdecf2ccb674" containerName="extract-content" Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 09:00:53.040859 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038a2b56-42df-4121-b7b4-bdecf2ccb674" containerName="extract-utilities" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.040869 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="038a2b56-42df-4121-b7b4-bdecf2ccb674" containerName="extract-utilities" Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 09:00:53.040880 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c775cf-b7e3-4171-83f0-49b6f77a5e51" containerName="route-controller-manager" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.040888 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c775cf-b7e3-4171-83f0-49b6f77a5e51" containerName="route-controller-manager" Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 09:00:53.040900 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a671bb4b-c176-4930-8b09-c5f1b03e27c9" containerName="registry-server" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.040908 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a671bb4b-c176-4930-8b09-c5f1b03e27c9" containerName="registry-server" Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 09:00:53.040921 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a671bb4b-c176-4930-8b09-c5f1b03e27c9" containerName="extract-utilities" Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.040929 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a671bb4b-c176-4930-8b09-c5f1b03e27c9" containerName="extract-utilities" Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 
09:00:53.040943 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39583bbe-6bbd-4423-b048-61c0dc5d955e" containerName="extract-utilities"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.040953 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="39583bbe-6bbd-4423-b048-61c0dc5d955e" containerName="extract-utilities"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.041076 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="716b429f-386c-4ef4-9951-500bb511dc6b" containerName="controller-manager"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.041090 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="038a2b56-42df-4121-b7b4-bdecf2ccb674" containerName="registry-server"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.041101 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="39583bbe-6bbd-4423-b048-61c0dc5d955e" containerName="registry-server"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.041115 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="011b2d6b-88b0-4013-9ded-b9845c02dec0" containerName="registry-server"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.041127 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c775cf-b7e3-4171-83f0-49b6f77a5e51" containerName="route-controller-manager"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.041139 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="feea3874-1da5-4b39-b76d-06eea186b678" containerName="pruner"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.041151 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c" containerName="oc"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.041165 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="a671bb4b-c176-4930-8b09-c5f1b03e27c9" containerName="registry-server"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.041660 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.045522 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.046439 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.046878 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.049471 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.049861 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.050589 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7986466d5c-5v6d7"]
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.052070 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.052272 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.057069 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.057512 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.057852 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.058117 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.058285 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.058282 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.061992 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"]
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.070084 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5glts"]
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.074881 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.076168 4956 scope.go:117] "RemoveContainer" containerID="b2738f92d254f7a9d241d8c2797ffec20057b8f31e7396407d7f3372c0dbf6e7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.085385 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5glts"]
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.089767 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7986466d5c-5v6d7"]
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.101723 4956 scope.go:117] "RemoveContainer" containerID="d89a56976a0d23ac0472ec457ea4438d234bde87645de809a60678b1a3742e05"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.102060 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jvgg\" (UniqueName: \"kubernetes.io/projected/aec6ab77-fd2d-4f81-8721-7440032c9f24-kube-api-access-7jvgg\") pod \"controller-manager-7986466d5c-5v6d7\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.102111 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-config\") pod \"controller-manager-7986466d5c-5v6d7\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.102147 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-proxy-ca-bundles\") pod \"controller-manager-7986466d5c-5v6d7\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.102171 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-config\") pod \"route-controller-manager-59c7c6576c-grw8f\" (UID: \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\") " pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.102218 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-client-ca\") pod \"route-controller-manager-59c7c6576c-grw8f\" (UID: \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\") " pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.102261 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aec6ab77-fd2d-4f81-8721-7440032c9f24-serving-cert\") pod \"controller-manager-7986466d5c-5v6d7\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.102291 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-serving-cert\") pod \"route-controller-manager-59c7c6576c-grw8f\" (UID: \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\") " pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.102324 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2vbc\" (UniqueName: \"kubernetes.io/projected/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-kube-api-access-v2vbc\") pod \"route-controller-manager-59c7c6576c-grw8f\" (UID: \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\") " pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.102363 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-client-ca\") pod \"controller-manager-7986466d5c-5v6d7\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 09:00:53.102747 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d89a56976a0d23ac0472ec457ea4438d234bde87645de809a60678b1a3742e05\": container with ID starting with d89a56976a0d23ac0472ec457ea4438d234bde87645de809a60678b1a3742e05 not found: ID does not exist" containerID="d89a56976a0d23ac0472ec457ea4438d234bde87645de809a60678b1a3742e05"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.102809 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d89a56976a0d23ac0472ec457ea4438d234bde87645de809a60678b1a3742e05"} err="failed to get container status \"d89a56976a0d23ac0472ec457ea4438d234bde87645de809a60678b1a3742e05\": rpc error: code = NotFound desc = could not find container \"d89a56976a0d23ac0472ec457ea4438d234bde87645de809a60678b1a3742e05\": container with ID starting with d89a56976a0d23ac0472ec457ea4438d234bde87645de809a60678b1a3742e05 not found: ID does not exist"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.102839 4956 scope.go:117] "RemoveContainer" containerID="39cb070a882089e1b3d0d07b7fd0ceeb38015f84a3ac3c8e465b83e1ff25ed8a"
Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 09:00:53.104070 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39cb070a882089e1b3d0d07b7fd0ceeb38015f84a3ac3c8e465b83e1ff25ed8a\": container with ID starting with 39cb070a882089e1b3d0d07b7fd0ceeb38015f84a3ac3c8e465b83e1ff25ed8a not found: ID does not exist" containerID="39cb070a882089e1b3d0d07b7fd0ceeb38015f84a3ac3c8e465b83e1ff25ed8a"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.104282 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39cb070a882089e1b3d0d07b7fd0ceeb38015f84a3ac3c8e465b83e1ff25ed8a"} err="failed to get container status \"39cb070a882089e1b3d0d07b7fd0ceeb38015f84a3ac3c8e465b83e1ff25ed8a\": rpc error: code = NotFound desc = could not find container \"39cb070a882089e1b3d0d07b7fd0ceeb38015f84a3ac3c8e465b83e1ff25ed8a\": container with ID starting with 39cb070a882089e1b3d0d07b7fd0ceeb38015f84a3ac3c8e465b83e1ff25ed8a not found: ID does not exist"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.104453 4956 scope.go:117] "RemoveContainer" containerID="b2738f92d254f7a9d241d8c2797ffec20057b8f31e7396407d7f3372c0dbf6e7"
Mar 14 09:00:53 crc kubenswrapper[4956]: E0314 09:00:53.105199 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2738f92d254f7a9d241d8c2797ffec20057b8f31e7396407d7f3372c0dbf6e7\": container with ID starting with b2738f92d254f7a9d241d8c2797ffec20057b8f31e7396407d7f3372c0dbf6e7 not found: ID does not exist" containerID="b2738f92d254f7a9d241d8c2797ffec20057b8f31e7396407d7f3372c0dbf6e7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.105405 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2738f92d254f7a9d241d8c2797ffec20057b8f31e7396407d7f3372c0dbf6e7"} err="failed to get container status \"b2738f92d254f7a9d241d8c2797ffec20057b8f31e7396407d7f3372c0dbf6e7\": rpc error: code = NotFound desc = could not find container \"b2738f92d254f7a9d241d8c2797ffec20057b8f31e7396407d7f3372c0dbf6e7\": container with ID starting with b2738f92d254f7a9d241d8c2797ffec20057b8f31e7396407d7f3372c0dbf6e7 not found: ID does not exist"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.203260 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aec6ab77-fd2d-4f81-8721-7440032c9f24-serving-cert\") pod \"controller-manager-7986466d5c-5v6d7\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.203510 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-serving-cert\") pod \"route-controller-manager-59c7c6576c-grw8f\" (UID: \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\") " pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.203654 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2vbc\" (UniqueName: \"kubernetes.io/projected/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-kube-api-access-v2vbc\") pod \"route-controller-manager-59c7c6576c-grw8f\" (UID: \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\") " pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.203782 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-client-ca\") pod \"controller-manager-7986466d5c-5v6d7\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.203880 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jvgg\" (UniqueName: \"kubernetes.io/projected/aec6ab77-fd2d-4f81-8721-7440032c9f24-kube-api-access-7jvgg\") pod \"controller-manager-7986466d5c-5v6d7\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.203968 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-config\") pod \"controller-manager-7986466d5c-5v6d7\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.204045 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-config\") pod \"route-controller-manager-59c7c6576c-grw8f\" (UID: \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\") " pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.204119 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-proxy-ca-bundles\") pod \"controller-manager-7986466d5c-5v6d7\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.204199 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-client-ca\") pod \"route-controller-manager-59c7c6576c-grw8f\" (UID: \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\") " pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.206931 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-client-ca\") pod \"route-controller-manager-59c7c6576c-grw8f\" (UID: \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\") " pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.207591 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-client-ca\") pod \"controller-manager-7986466d5c-5v6d7\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.209118 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-config\") pod \"controller-manager-7986466d5c-5v6d7\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.209763 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aec6ab77-fd2d-4f81-8721-7440032c9f24-serving-cert\") pod \"controller-manager-7986466d5c-5v6d7\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.210224 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-serving-cert\") pod \"route-controller-manager-59c7c6576c-grw8f\" (UID: \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\") " pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.210836 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-proxy-ca-bundles\") pod \"controller-manager-7986466d5c-5v6d7\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.212948 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-config\") pod \"route-controller-manager-59c7c6576c-grw8f\" (UID: \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\") " pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.221208 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39583bbe-6bbd-4423-b048-61c0dc5d955e" path="/var/lib/kubelet/pods/39583bbe-6bbd-4423-b048-61c0dc5d955e/volumes"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.222598 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="716b429f-386c-4ef4-9951-500bb511dc6b" path="/var/lib/kubelet/pods/716b429f-386c-4ef4-9951-500bb511dc6b/volumes"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.223475 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a671bb4b-c176-4930-8b09-c5f1b03e27c9" path="/var/lib/kubelet/pods/a671bb4b-c176-4930-8b09-c5f1b03e27c9/volumes"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.225810 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3c775cf-b7e3-4171-83f0-49b6f77a5e51" path="/var/lib/kubelet/pods/e3c775cf-b7e3-4171-83f0-49b6f77a5e51/volumes"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.228652 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jvgg\" (UniqueName: \"kubernetes.io/projected/aec6ab77-fd2d-4f81-8721-7440032c9f24-kube-api-access-7jvgg\") pod \"controller-manager-7986466d5c-5v6d7\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.229698 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2vbc\" (UniqueName: \"kubernetes.io/projected/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-kube-api-access-v2vbc\") pod \"route-controller-manager-59c7c6576c-grw8f\" (UID: \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\") " pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.390570 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.420559 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.632112 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"]
Mar 14 09:00:53 crc kubenswrapper[4956]: W0314 09:00:53.640886 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c2f0016_a8bc_41d2_8f8d_4f6fbb1bb454.slice/crio-8d705fe1576fd29e9844ba445d7643307939c6b2d08aa7c92ebddbfcd5360d58 WatchSource:0}: Error finding container 8d705fe1576fd29e9844ba445d7643307939c6b2d08aa7c92ebddbfcd5360d58: Status 404 returned error can't find the container with id 8d705fe1576fd29e9844ba445d7643307939c6b2d08aa7c92ebddbfcd5360d58
Mar 14 09:00:53 crc kubenswrapper[4956]: I0314 09:00:53.690670 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7986466d5c-5v6d7"]
Mar 14 09:00:53 crc kubenswrapper[4956]: W0314 09:00:53.700031 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaec6ab77_fd2d_4f81_8721_7440032c9f24.slice/crio-fb9903c5bdb4b0a2bb93bb803335931237bee750d112b0884c08b50d65d6afc4 WatchSource:0}: Error finding container fb9903c5bdb4b0a2bb93bb803335931237bee750d112b0884c08b50d65d6afc4: Status 404 returned error can't find the container with id fb9903c5bdb4b0a2bb93bb803335931237bee750d112b0884c08b50d65d6afc4
Mar 14 09:00:54 crc kubenswrapper[4956]: I0314 09:00:54.019725 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f" event={"ID":"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454","Type":"ContainerStarted","Data":"8d705fe1576fd29e9844ba445d7643307939c6b2d08aa7c92ebddbfcd5360d58"}
Mar 14 09:00:54 crc kubenswrapper[4956]: I0314 09:00:54.024971 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7" event={"ID":"aec6ab77-fd2d-4f81-8721-7440032c9f24","Type":"ContainerStarted","Data":"fb9903c5bdb4b0a2bb93bb803335931237bee750d112b0884c08b50d65d6afc4"}
Mar 14 09:00:55 crc kubenswrapper[4956]: I0314 09:00:55.030644 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f" event={"ID":"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454","Type":"ContainerStarted","Data":"affecb3b2c1ef39c11cbab889919b73267fdb9b6b7d569336f05676141acb51d"}
Mar 14 09:00:55 crc kubenswrapper[4956]: I0314 09:00:55.030940 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"
Mar 14 09:00:55 crc kubenswrapper[4956]: I0314 09:00:55.032370 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7" event={"ID":"aec6ab77-fd2d-4f81-8721-7440032c9f24","Type":"ContainerStarted","Data":"38ef84b9567c4183943626c57109c3afa6f7da31552e7a5ac819292bf95fc7be"}
Mar 14 09:00:55 crc kubenswrapper[4956]: I0314 09:00:55.032592 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:55 crc kubenswrapper[4956]: I0314 09:00:55.036135 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"
Mar 14 09:00:55 crc kubenswrapper[4956]: I0314 09:00:55.037518 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7"
Mar 14 09:00:55 crc kubenswrapper[4956]: I0314 09:00:55.050041 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f" podStartSLOduration=4.050021487 podStartE2EDuration="4.050021487s" podCreationTimestamp="2026-03-14 09:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:00:55.047660156 +0000 UTC m=+260.560352424" watchObservedRunningTime="2026-03-14 09:00:55.050021487 +0000 UTC m=+260.562713755"
Mar 14 09:00:55 crc kubenswrapper[4956]: I0314 09:00:55.066994 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7" podStartSLOduration=4.066971062 podStartE2EDuration="4.066971062s" podCreationTimestamp="2026-03-14 09:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:00:55.064318504 +0000 UTC m=+260.577010772" watchObservedRunningTime="2026-03-14 09:00:55.066971062 +0000 UTC m=+260.579663330"
Mar 14 09:00:55 crc kubenswrapper[4956]: I0314 09:00:55.424025 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 09:00:55 crc kubenswrapper[4956]: I0314 09:00:55.424385 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 09:00:57 crc kubenswrapper[4956]: I0314 09:00:57.437045 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 14 09:00:58 crc kubenswrapper[4956]: I0314 09:00:58.998521 4956 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.000178 4956 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.000382 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.000715 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e" gracePeriod=15
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.000862 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9" gracePeriod=15
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.000892 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa" gracePeriod=15
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.000881 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d" gracePeriod=15
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.000975 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1" gracePeriod=15
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.002762 4956 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 14 09:00:59 crc kubenswrapper[4956]: E0314 09:00:59.003058 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.003120 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 14 09:00:59 crc kubenswrapper[4956]: E0314 09:00:59.003731 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.003766 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 14 09:00:59 crc kubenswrapper[4956]: E0314 09:00:59.003789 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.003803 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 14 09:00:59 crc kubenswrapper[4956]: E0314 09:00:59.003825 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.003840 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 14 09:00:59 crc kubenswrapper[4956]: E0314 09:00:59.003854 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.003871 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 14 09:00:59 crc kubenswrapper[4956]: E0314 09:00:59.003894 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.003908 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 14 09:00:59 crc kubenswrapper[4956]: E0314 09:00:59.003929 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.003941 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 14 09:00:59 crc kubenswrapper[4956]: E0314 09:00:59.003973 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.003986 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 14 09:00:59 crc kubenswrapper[4956]: E0314 09:00:59.004014 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.004028 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.004287 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.004319 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.004337 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.004355 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.004369 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.004387 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.004405 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.004418 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 14 09:00:59 crc kubenswrapper[4956]: E0314 09:00:59.004658 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.004675 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.004864 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.088769 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.088835 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.088873 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.088902 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.089039 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.089150 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.089192 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.089240 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID:
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.190272 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.190324 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.190363 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.190394 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.190417 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.190435 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.190453 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.190452 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.190546 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.190495 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.190601 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.190609 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.190636 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.190680 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 09:00:59.190708 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 09:00:59 crc kubenswrapper[4956]: I0314 
09:00:59.190733 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 09:01:00 crc kubenswrapper[4956]: I0314 09:01:00.065325 4956 generic.go:334] "Generic (PLEG): container finished" podID="ab17437b-05d0-4ae8-8d53-2b7eb384acbe" containerID="a6d1ca41b0391bfadf528786ecee02cf9ef21d35ffa087d062488db5e07cf9df" exitCode=0 Mar 14 09:01:00 crc kubenswrapper[4956]: I0314 09:01:00.065443 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ab17437b-05d0-4ae8-8d53-2b7eb384acbe","Type":"ContainerDied","Data":"a6d1ca41b0391bfadf528786ecee02cf9ef21d35ffa087d062488db5e07cf9df"} Mar 14 09:01:00 crc kubenswrapper[4956]: I0314 09:01:00.067439 4956 status_manager.go:851] "Failed to get status for pod" podUID="ab17437b-05d0-4ae8-8d53-2b7eb384acbe" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 14 09:01:00 crc kubenswrapper[4956]: I0314 09:01:00.070564 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 09:01:00 crc kubenswrapper[4956]: I0314 09:01:00.073626 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 09:01:00 crc kubenswrapper[4956]: I0314 09:01:00.074735 4956 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9" exitCode=0 Mar 14 09:01:00 crc kubenswrapper[4956]: I0314 09:01:00.074818 4956 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1" exitCode=0 Mar 14 09:01:00 crc kubenswrapper[4956]: I0314 09:01:00.074840 4956 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa" exitCode=0 Mar 14 09:01:00 crc kubenswrapper[4956]: I0314 09:01:00.074863 4956 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d" exitCode=2 Mar 14 09:01:00 crc kubenswrapper[4956]: I0314 09:01:00.074979 4956 scope.go:117] "RemoveContainer" containerID="f1b4f85faa3f58e5bfed5c0a65d85ebd9f8791d0b717edbdf76e7b38406b0cfa" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.082793 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.378674 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.380239 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.380882 4956 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.381410 4956 status_manager.go:851] "Failed to get status for pod" podUID="ab17437b-05d0-4ae8-8d53-2b7eb384acbe" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.423338 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.423544 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.423562 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.423577 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.423622 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.423621 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.424098 4956 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.424122 4956 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.424132 4956 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.476335 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.477067 4956 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.477375 4956 status_manager.go:851] "Failed to get status for pod" podUID="ab17437b-05d0-4ae8-8d53-2b7eb384acbe" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.525366 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-kubelet-dir\") pod \"ab17437b-05d0-4ae8-8d53-2b7eb384acbe\" (UID: \"ab17437b-05d0-4ae8-8d53-2b7eb384acbe\") " Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.525380 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ab17437b-05d0-4ae8-8d53-2b7eb384acbe" (UID: "ab17437b-05d0-4ae8-8d53-2b7eb384acbe"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.525475 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-kube-api-access\") pod \"ab17437b-05d0-4ae8-8d53-2b7eb384acbe\" (UID: \"ab17437b-05d0-4ae8-8d53-2b7eb384acbe\") " Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.525661 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-var-lock\") pod \"ab17437b-05d0-4ae8-8d53-2b7eb384acbe\" (UID: \"ab17437b-05d0-4ae8-8d53-2b7eb384acbe\") " Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.525839 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-var-lock" (OuterVolumeSpecName: "var-lock") pod "ab17437b-05d0-4ae8-8d53-2b7eb384acbe" (UID: "ab17437b-05d0-4ae8-8d53-2b7eb384acbe"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.526116 4956 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-var-lock\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.526141 4956 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.530935 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ab17437b-05d0-4ae8-8d53-2b7eb384acbe" (UID: "ab17437b-05d0-4ae8-8d53-2b7eb384acbe"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:01:01 crc kubenswrapper[4956]: I0314 09:01:01.627514 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab17437b-05d0-4ae8-8d53-2b7eb384acbe-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.090016 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ab17437b-05d0-4ae8-8d53-2b7eb384acbe","Type":"ContainerDied","Data":"8ecece98e3473dd204ad1e816247445ea9b04caefa3ed183c8b4a2eb547b5bfd"} Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.090060 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ecece98e3473dd204ad1e816247445ea9b04caefa3ed183c8b4a2eb547b5bfd" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.090086 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.093574 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.094375 4956 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e" exitCode=0 Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.094434 4956 scope.go:117] "RemoveContainer" containerID="c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.094491 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.107841 4956 scope.go:117] "RemoveContainer" containerID="dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.117336 4956 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.117546 4956 status_manager.go:851] "Failed to get status for pod" podUID="ab17437b-05d0-4ae8-8d53-2b7eb384acbe" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.119983 4956 status_manager.go:851] "Failed to get status for 
pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.120333 4956 status_manager.go:851] "Failed to get status for pod" podUID="ab17437b-05d0-4ae8-8d53-2b7eb384acbe" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.124585 4956 scope.go:117] "RemoveContainer" containerID="16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.135897 4956 scope.go:117] "RemoveContainer" containerID="fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.149511 4956 scope.go:117] "RemoveContainer" containerID="bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.163808 4956 scope.go:117] "RemoveContainer" containerID="496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.181584 4956 scope.go:117] "RemoveContainer" containerID="c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9" Mar 14 09:01:02 crc kubenswrapper[4956]: E0314 09:01:02.182256 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\": container with ID starting with c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9 not found: ID does not exist" 
containerID="c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.182292 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9"} err="failed to get container status \"c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\": rpc error: code = NotFound desc = could not find container \"c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9\": container with ID starting with c52c84a296add51f272da4ad81c7565a38871f517d7b25a124255c8136e88ec9 not found: ID does not exist" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.182318 4956 scope.go:117] "RemoveContainer" containerID="dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1" Mar 14 09:01:02 crc kubenswrapper[4956]: E0314 09:01:02.182650 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\": container with ID starting with dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1 not found: ID does not exist" containerID="dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.182680 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1"} err="failed to get container status \"dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\": rpc error: code = NotFound desc = could not find container \"dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1\": container with ID starting with dcc36348270e2d6533d6891bc7ce4e19fc7421925a1f5f0c8e78f8d31c5fabb1 not found: ID does not exist" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.182698 4956 scope.go:117] 
"RemoveContainer" containerID="16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa" Mar 14 09:01:02 crc kubenswrapper[4956]: E0314 09:01:02.183606 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\": container with ID starting with 16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa not found: ID does not exist" containerID="16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.183683 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa"} err="failed to get container status \"16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\": rpc error: code = NotFound desc = could not find container \"16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa\": container with ID starting with 16a2225ba6a3a97f1eeeb21915164ce9b8fd976718db706f0ebcaf22222af3aa not found: ID does not exist" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.183747 4956 scope.go:117] "RemoveContainer" containerID="fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d" Mar 14 09:01:02 crc kubenswrapper[4956]: E0314 09:01:02.184169 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\": container with ID starting with fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d not found: ID does not exist" containerID="fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.184196 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d"} err="failed to get container status \"fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\": rpc error: code = NotFound desc = could not find container \"fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d\": container with ID starting with fe5412205f2ac9f86b4166f78e0ed5268912a7169fed2f34a5f083ebb510c47d not found: ID does not exist" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.184398 4956 scope.go:117] "RemoveContainer" containerID="bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e" Mar 14 09:01:02 crc kubenswrapper[4956]: E0314 09:01:02.184958 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\": container with ID starting with bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e not found: ID does not exist" containerID="bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.184987 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e"} err="failed to get container status \"bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\": rpc error: code = NotFound desc = could not find container \"bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e\": container with ID starting with bc5af045b8e28803970a07c31ea7a893f8df1ef379783f1ba0bcca4fd4a5372e not found: ID does not exist" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.185016 4956 scope.go:117] "RemoveContainer" containerID="496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b" Mar 14 09:01:02 crc kubenswrapper[4956]: E0314 09:01:02.185268 4956 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\": container with ID starting with 496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b not found: ID does not exist" containerID="496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b" Mar 14 09:01:02 crc kubenswrapper[4956]: I0314 09:01:02.185295 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b"} err="failed to get container status \"496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\": rpc error: code = NotFound desc = could not find container \"496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b\": container with ID starting with 496b680582aa17f89d59ed52f543414b956a76e3019ac13d43ab8eb1b0cf047b not found: ID does not exist" Mar 14 09:01:03 crc kubenswrapper[4956]: I0314 09:01:03.216250 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 14 09:01:04 crc kubenswrapper[4956]: E0314 09:01:04.046869 4956 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.32:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 09:01:04 crc kubenswrapper[4956]: I0314 09:01:04.047579 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 09:01:04 crc kubenswrapper[4956]: E0314 09:01:04.077239 4956 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.32:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ca9a16df2c465 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 09:01:04.076588133 +0000 UTC m=+269.589280401,LastTimestamp:2026-03-14 09:01:04.076588133 +0000 UTC m=+269.589280401,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 09:01:04 crc kubenswrapper[4956]: I0314 09:01:04.112788 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2a454718e9551ee82163c3b91c4c8c071af7e79768aeea36dd0e4f8b6b005167"} Mar 14 09:01:05 crc kubenswrapper[4956]: I0314 09:01:05.121257 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"aa3e6b25c08ed5bb0517e49549c3a64499d062263b1b177af6a78b364b1066c2"} Mar 14 09:01:05 crc 
kubenswrapper[4956]: I0314 09:01:05.122663 4956 status_manager.go:851] "Failed to get status for pod" podUID="ab17437b-05d0-4ae8-8d53-2b7eb384acbe" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 14 09:01:05 crc kubenswrapper[4956]: E0314 09:01:05.122870 4956 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.32:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 09:01:05 crc kubenswrapper[4956]: I0314 09:01:05.212119 4956 status_manager.go:851] "Failed to get status for pod" podUID="ab17437b-05d0-4ae8-8d53-2b7eb384acbe" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 14 09:01:06 crc kubenswrapper[4956]: E0314 09:01:06.128339 4956 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.32:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 09:01:07 crc kubenswrapper[4956]: E0314 09:01:07.398686 4956 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 14 09:01:07 crc kubenswrapper[4956]: E0314 09:01:07.399499 4956 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: 
connection refused" Mar 14 09:01:07 crc kubenswrapper[4956]: E0314 09:01:07.399896 4956 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 14 09:01:07 crc kubenswrapper[4956]: E0314 09:01:07.400776 4956 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 14 09:01:07 crc kubenswrapper[4956]: E0314 09:01:07.401055 4956 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 14 09:01:07 crc kubenswrapper[4956]: I0314 09:01:07.401086 4956 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 14 09:01:07 crc kubenswrapper[4956]: E0314 09:01:07.401471 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="200ms" Mar 14 09:01:07 crc kubenswrapper[4956]: E0314 09:01:07.602138 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="400ms" Mar 14 09:01:08 crc kubenswrapper[4956]: E0314 09:01:08.003080 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="800ms" Mar 14 09:01:08 crc kubenswrapper[4956]: E0314 09:01:08.626074 4956 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.32:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ca9a16df2c465 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 09:01:04.076588133 +0000 UTC m=+269.589280401,LastTimestamp:2026-03-14 09:01:04.076588133 +0000 UTC m=+269.589280401,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 09:01:08 crc kubenswrapper[4956]: E0314 09:01:08.804381 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="1.6s" Mar 14 09:01:10 crc kubenswrapper[4956]: I0314 09:01:10.209300 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:01:10 crc kubenswrapper[4956]: I0314 09:01:10.210616 4956 status_manager.go:851] "Failed to get status for pod" podUID="ab17437b-05d0-4ae8-8d53-2b7eb384acbe" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 14 09:01:10 crc kubenswrapper[4956]: I0314 09:01:10.225293 4956 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="398abfcc-e8de-4f30-ae7a-f20c3120f379" Mar 14 09:01:10 crc kubenswrapper[4956]: I0314 09:01:10.225429 4956 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="398abfcc-e8de-4f30-ae7a-f20c3120f379" Mar 14 09:01:10 crc kubenswrapper[4956]: E0314 09:01:10.225849 4956 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:01:10 crc kubenswrapper[4956]: I0314 09:01:10.226242 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:01:10 crc kubenswrapper[4956]: W0314 09:01:10.246261 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-ca6a82f640ad9eae04f06ae1d78422d3a59a7bf7ebf5dc12a1051cc5efa5404e WatchSource:0}: Error finding container ca6a82f640ad9eae04f06ae1d78422d3a59a7bf7ebf5dc12a1051cc5efa5404e: Status 404 returned error can't find the container with id ca6a82f640ad9eae04f06ae1d78422d3a59a7bf7ebf5dc12a1051cc5efa5404e Mar 14 09:01:10 crc kubenswrapper[4956]: E0314 09:01:10.405566 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="3.2s" Mar 14 09:01:11 crc kubenswrapper[4956]: I0314 09:01:11.154885 4956 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="ca59c9d4fa20a37315e4b578e3ef678ce610c9ad74fc394cb5eda49051d5afc8" exitCode=0 Mar 14 09:01:11 crc kubenswrapper[4956]: I0314 09:01:11.154937 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"ca59c9d4fa20a37315e4b578e3ef678ce610c9ad74fc394cb5eda49051d5afc8"} Mar 14 09:01:11 crc kubenswrapper[4956]: I0314 09:01:11.154972 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ca6a82f640ad9eae04f06ae1d78422d3a59a7bf7ebf5dc12a1051cc5efa5404e"} Mar 14 09:01:11 crc kubenswrapper[4956]: I0314 09:01:11.155261 4956 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="398abfcc-e8de-4f30-ae7a-f20c3120f379" Mar 14 09:01:11 crc kubenswrapper[4956]: I0314 09:01:11.155289 4956 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="398abfcc-e8de-4f30-ae7a-f20c3120f379" Mar 14 09:01:11 crc kubenswrapper[4956]: E0314 09:01:11.155677 4956 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:01:11 crc kubenswrapper[4956]: I0314 09:01:11.155703 4956 status_manager.go:851] "Failed to get status for pod" podUID="ab17437b-05d0-4ae8-8d53-2b7eb384acbe" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 14 09:01:12 crc kubenswrapper[4956]: I0314 09:01:12.163567 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"aed609a735964948170d94e6167af21d8341d92fddd26dd2a18ee8a7d8640986"} Mar 14 09:01:12 crc kubenswrapper[4956]: I0314 09:01:12.163928 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0a1a5f0a9d3656eebb9cbaf2de656cf4699c4c521da47d6eb61df19e56d6b2f4"} Mar 14 09:01:12 crc kubenswrapper[4956]: I0314 09:01:12.163942 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fe87aac630f9c169e3940bb9a3d39616d77620ceef40abed0d7ec72b48593d3d"} Mar 14 09:01:12 crc kubenswrapper[4956]: I0314 09:01:12.601226 4956 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" podUID="22f83565-681d-490b-bd27-d21b456c6e25" containerName="oauth-openshift" containerID="cri-o://351edad8e62d54e768f359ed889d4a9c4f00913c657c3f91bf7eca8cf84fd7b0" gracePeriod=15 Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.171700 4956 generic.go:334] "Generic (PLEG): container finished" podID="22f83565-681d-490b-bd27-d21b456c6e25" containerID="351edad8e62d54e768f359ed889d4a9c4f00913c657c3f91bf7eca8cf84fd7b0" exitCode=0 Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.171771 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" event={"ID":"22f83565-681d-490b-bd27-d21b456c6e25","Type":"ContainerDied","Data":"351edad8e62d54e768f359ed889d4a9c4f00913c657c3f91bf7eca8cf84fd7b0"} Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.174761 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ea7892fdd8f2dcf8044023ddf51c3fb7214a9a4d96b7ea0da327e62911c69f72"} Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.174803 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"daed29ffec03e8a069b49ddc9931dac8adbf4de29ca8ee39bb00e8d1871c132e"} Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.174895 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.174975 4956 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="398abfcc-e8de-4f30-ae7a-f20c3120f379" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.174997 4956 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="398abfcc-e8de-4f30-ae7a-f20c3120f379" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.176714 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.177902 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.177940 4956 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="52f963cc29f2d8c086ca6286cfe2cc00d0c427910d03234fca73e6240c9bdcc0" exitCode=1 Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.177963 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"52f963cc29f2d8c086ca6286cfe2cc00d0c427910d03234fca73e6240c9bdcc0"} Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.178361 4956 scope.go:117] "RemoveContainer" containerID="52f963cc29f2d8c086ca6286cfe2cc00d0c427910d03234fca73e6240c9bdcc0" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.738626 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.777520 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jltv\" (UniqueName: \"kubernetes.io/projected/22f83565-681d-490b-bd27-d21b456c6e25-kube-api-access-8jltv\") pod \"22f83565-681d-490b-bd27-d21b456c6e25\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.777578 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-login\") pod \"22f83565-681d-490b-bd27-d21b456c6e25\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.777597 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-serving-cert\") pod \"22f83565-681d-490b-bd27-d21b456c6e25\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.777615 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-session\") pod \"22f83565-681d-490b-bd27-d21b456c6e25\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.777634 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-router-certs\") pod \"22f83565-681d-490b-bd27-d21b456c6e25\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " Mar 14 
09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.777650 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-trusted-ca-bundle\") pod \"22f83565-681d-490b-bd27-d21b456c6e25\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.777677 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/22f83565-681d-490b-bd27-d21b456c6e25-audit-dir\") pod \"22f83565-681d-490b-bd27-d21b456c6e25\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.777696 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-service-ca\") pod \"22f83565-681d-490b-bd27-d21b456c6e25\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.777754 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-cliconfig\") pod \"22f83565-681d-490b-bd27-d21b456c6e25\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.777802 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-audit-policies\") pod \"22f83565-681d-490b-bd27-d21b456c6e25\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.777850 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-provider-selection\") pod \"22f83565-681d-490b-bd27-d21b456c6e25\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.777908 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-idp-0-file-data\") pod \"22f83565-681d-490b-bd27-d21b456c6e25\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.777936 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-error\") pod \"22f83565-681d-490b-bd27-d21b456c6e25\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.777987 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-ocp-branding-template\") pod \"22f83565-681d-490b-bd27-d21b456c6e25\" (UID: \"22f83565-681d-490b-bd27-d21b456c6e25\") " Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.777832 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/22f83565-681d-490b-bd27-d21b456c6e25-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "22f83565-681d-490b-bd27-d21b456c6e25" (UID: "22f83565-681d-490b-bd27-d21b456c6e25"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.778238 4956 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/22f83565-681d-490b-bd27-d21b456c6e25-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.778499 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "22f83565-681d-490b-bd27-d21b456c6e25" (UID: "22f83565-681d-490b-bd27-d21b456c6e25"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.778523 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "22f83565-681d-490b-bd27-d21b456c6e25" (UID: "22f83565-681d-490b-bd27-d21b456c6e25"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.778464 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "22f83565-681d-490b-bd27-d21b456c6e25" (UID: "22f83565-681d-490b-bd27-d21b456c6e25"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.778869 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "22f83565-681d-490b-bd27-d21b456c6e25" (UID: "22f83565-681d-490b-bd27-d21b456c6e25"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.783410 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "22f83565-681d-490b-bd27-d21b456c6e25" (UID: "22f83565-681d-490b-bd27-d21b456c6e25"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.783808 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "22f83565-681d-490b-bd27-d21b456c6e25" (UID: "22f83565-681d-490b-bd27-d21b456c6e25"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.784047 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "22f83565-681d-490b-bd27-d21b456c6e25" (UID: "22f83565-681d-490b-bd27-d21b456c6e25"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.784273 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "22f83565-681d-490b-bd27-d21b456c6e25" (UID: "22f83565-681d-490b-bd27-d21b456c6e25"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.784414 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "22f83565-681d-490b-bd27-d21b456c6e25" (UID: "22f83565-681d-490b-bd27-d21b456c6e25"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.784623 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "22f83565-681d-490b-bd27-d21b456c6e25" (UID: "22f83565-681d-490b-bd27-d21b456c6e25"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.784821 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "22f83565-681d-490b-bd27-d21b456c6e25" (UID: "22f83565-681d-490b-bd27-d21b456c6e25"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.784838 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "22f83565-681d-490b-bd27-d21b456c6e25" (UID: "22f83565-681d-490b-bd27-d21b456c6e25"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.785751 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22f83565-681d-490b-bd27-d21b456c6e25-kube-api-access-8jltv" (OuterVolumeSpecName: "kube-api-access-8jltv") pod "22f83565-681d-490b-bd27-d21b456c6e25" (UID: "22f83565-681d-490b-bd27-d21b456c6e25"). InnerVolumeSpecName "kube-api-access-8jltv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.878896 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.878930 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.878940 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 
09:01:13.878951 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jltv\" (UniqueName: \"kubernetes.io/projected/22f83565-681d-490b-bd27-d21b456c6e25-kube-api-access-8jltv\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.878963 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.878974 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.878984 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.878993 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.879001 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.879010 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-service-ca\") 
on node \"crc\" DevicePath \"\"" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.879020 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.879028 4956 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/22f83565-681d-490b-bd27-d21b456c6e25-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:13 crc kubenswrapper[4956]: I0314 09:01:13.879037 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/22f83565-681d-490b-bd27-d21b456c6e25-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:14 crc kubenswrapper[4956]: I0314 09:01:14.185997 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 14 09:01:14 crc kubenswrapper[4956]: I0314 09:01:14.187237 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 14 09:01:14 crc kubenswrapper[4956]: I0314 09:01:14.187339 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"558da465f5ee72e206528e0d6583ac9464b900d73468e5a9d10b761cbe530e38"} Mar 14 09:01:14 crc kubenswrapper[4956]: I0314 09:01:14.188822 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" 
event={"ID":"22f83565-681d-490b-bd27-d21b456c6e25","Type":"ContainerDied","Data":"b72c1ece8041995c281416a454d56d9bff339390cd8f647c858483557668b2c8"} Mar 14 09:01:14 crc kubenswrapper[4956]: I0314 09:01:14.188880 4956 scope.go:117] "RemoveContainer" containerID="351edad8e62d54e768f359ed889d4a9c4f00913c657c3f91bf7eca8cf84fd7b0" Mar 14 09:01:14 crc kubenswrapper[4956]: I0314 09:01:14.188893 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m4xmw" Mar 14 09:01:15 crc kubenswrapper[4956]: I0314 09:01:15.226836 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:01:15 crc kubenswrapper[4956]: I0314 09:01:15.227194 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:01:15 crc kubenswrapper[4956]: I0314 09:01:15.232331 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:01:15 crc kubenswrapper[4956]: I0314 09:01:15.456889 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 09:01:18 crc kubenswrapper[4956]: I0314 09:01:18.184532 4956 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:01:18 crc kubenswrapper[4956]: I0314 09:01:18.216422 4956 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="398abfcc-e8de-4f30-ae7a-f20c3120f379" Mar 14 09:01:18 crc kubenswrapper[4956]: I0314 09:01:18.216450 4956 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="398abfcc-e8de-4f30-ae7a-f20c3120f379" Mar 14 09:01:18 crc kubenswrapper[4956]: I0314 09:01:18.222416 4956 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:01:18 crc kubenswrapper[4956]: I0314 09:01:18.265224 4956 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f84def7f-7cb2-48db-9df1-29ff914e8e49" Mar 14 09:01:19 crc kubenswrapper[4956]: I0314 09:01:19.220677 4956 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="398abfcc-e8de-4f30-ae7a-f20c3120f379" Mar 14 09:01:19 crc kubenswrapper[4956]: I0314 09:01:19.220715 4956 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="398abfcc-e8de-4f30-ae7a-f20c3120f379" Mar 14 09:01:19 crc kubenswrapper[4956]: I0314 09:01:19.223306 4956 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f84def7f-7cb2-48db-9df1-29ff914e8e49" Mar 14 09:01:19 crc kubenswrapper[4956]: I0314 09:01:19.369851 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 09:01:19 crc kubenswrapper[4956]: I0314 09:01:19.370204 4956 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 14 09:01:19 crc kubenswrapper[4956]: I0314 09:01:19.370276 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get 
\"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 14 09:01:25 crc kubenswrapper[4956]: I0314 09:01:25.097001 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 14 09:01:25 crc kubenswrapper[4956]: I0314 09:01:25.423620 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:01:25 crc kubenswrapper[4956]: I0314 09:01:25.424015 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:01:25 crc kubenswrapper[4956]: I0314 09:01:25.424104 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 09:01:25 crc kubenswrapper[4956]: I0314 09:01:25.424818 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b126c5e8dac0183ac3f1b4d3f310791b0219224ca6f75c23ed7ffbf39bd12cee"} pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:01:25 crc kubenswrapper[4956]: I0314 09:01:25.424918 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" 
containerID="cri-o://b126c5e8dac0183ac3f1b4d3f310791b0219224ca6f75c23ed7ffbf39bd12cee" gracePeriod=600 Mar 14 09:01:25 crc kubenswrapper[4956]: I0314 09:01:25.433443 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 14 09:01:26 crc kubenswrapper[4956]: I0314 09:01:26.262673 4956 generic.go:334] "Generic (PLEG): container finished" podID="9ba20367-e506-422e-a846-eb1525cb3b94" containerID="b126c5e8dac0183ac3f1b4d3f310791b0219224ca6f75c23ed7ffbf39bd12cee" exitCode=0 Mar 14 09:01:26 crc kubenswrapper[4956]: I0314 09:01:26.262780 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerDied","Data":"b126c5e8dac0183ac3f1b4d3f310791b0219224ca6f75c23ed7ffbf39bd12cee"} Mar 14 09:01:26 crc kubenswrapper[4956]: I0314 09:01:26.263102 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerStarted","Data":"b5af7eb19f784abccebf2c980453895184c4d674e7cf075f6fdc0e70b952b563"} Mar 14 09:01:26 crc kubenswrapper[4956]: I0314 09:01:26.624744 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 14 09:01:29 crc kubenswrapper[4956]: I0314 09:01:29.374353 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 09:01:29 crc kubenswrapper[4956]: I0314 09:01:29.384574 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 09:01:29 crc kubenswrapper[4956]: I0314 09:01:29.859328 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 14 
09:01:30 crc kubenswrapper[4956]: I0314 09:01:30.047225 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 14 09:01:30 crc kubenswrapper[4956]: I0314 09:01:30.108731 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 14 09:01:30 crc kubenswrapper[4956]: I0314 09:01:30.167832 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 14 09:01:30 crc kubenswrapper[4956]: I0314 09:01:30.175128 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 14 09:01:30 crc kubenswrapper[4956]: I0314 09:01:30.223251 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 14 09:01:30 crc kubenswrapper[4956]: I0314 09:01:30.439436 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 14 09:01:30 crc kubenswrapper[4956]: I0314 09:01:30.702271 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 14 09:01:30 crc kubenswrapper[4956]: I0314 09:01:30.776164 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 14 09:01:30 crc kubenswrapper[4956]: I0314 09:01:30.863842 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 09:01:30 crc kubenswrapper[4956]: I0314 09:01:30.970444 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 14 09:01:31 crc kubenswrapper[4956]: I0314 09:01:31.001430 4956 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"metrics-daemon-secret" Mar 14 09:01:31 crc kubenswrapper[4956]: I0314 09:01:31.036424 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 14 09:01:31 crc kubenswrapper[4956]: I0314 09:01:31.113850 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 14 09:01:31 crc kubenswrapper[4956]: I0314 09:01:31.284884 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 14 09:01:31 crc kubenswrapper[4956]: I0314 09:01:31.685186 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 14 09:01:31 crc kubenswrapper[4956]: I0314 09:01:31.768106 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 14 09:01:31 crc kubenswrapper[4956]: I0314 09:01:31.936235 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 14 09:01:31 crc kubenswrapper[4956]: I0314 09:01:31.949227 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 14 09:01:31 crc kubenswrapper[4956]: I0314 09:01:31.992934 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 14 09:01:32 crc kubenswrapper[4956]: I0314 09:01:32.032060 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 14 09:01:32 crc kubenswrapper[4956]: I0314 09:01:32.303000 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 14 09:01:32 crc kubenswrapper[4956]: I0314 09:01:32.352634 4956 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 09:01:32 crc kubenswrapper[4956]: I0314 09:01:32.399541 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 14 09:01:32 crc kubenswrapper[4956]: I0314 09:01:32.462543 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 14 09:01:32 crc kubenswrapper[4956]: I0314 09:01:32.640942 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 14 09:01:32 crc kubenswrapper[4956]: I0314 09:01:32.698870 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 14 09:01:32 crc kubenswrapper[4956]: I0314 09:01:32.710890 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 14 09:01:32 crc kubenswrapper[4956]: I0314 09:01:32.782384 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 14 09:01:32 crc kubenswrapper[4956]: I0314 09:01:32.793459 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 14 09:01:32 crc kubenswrapper[4956]: I0314 09:01:32.877357 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 14 09:01:32 crc kubenswrapper[4956]: I0314 09:01:32.995611 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 14 09:01:33 crc kubenswrapper[4956]: I0314 09:01:33.004887 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 14 09:01:33 crc kubenswrapper[4956]: I0314 
09:01:33.067150 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 14 09:01:33 crc kubenswrapper[4956]: I0314 09:01:33.139229 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 14 09:01:33 crc kubenswrapper[4956]: I0314 09:01:33.159447 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 14 09:01:33 crc kubenswrapper[4956]: I0314 09:01:33.189393 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 14 09:01:33 crc kubenswrapper[4956]: I0314 09:01:33.201840 4956 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 14 09:01:33 crc kubenswrapper[4956]: I0314 09:01:33.333980 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 14 09:01:33 crc kubenswrapper[4956]: I0314 09:01:33.397787 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 14 09:01:33 crc kubenswrapper[4956]: I0314 09:01:33.403235 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 14 09:01:33 crc kubenswrapper[4956]: I0314 09:01:33.471249 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 14 09:01:33 crc kubenswrapper[4956]: I0314 09:01:33.606209 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 14 09:01:33 crc kubenswrapper[4956]: I0314 09:01:33.663529 4956 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 14 09:01:33 crc kubenswrapper[4956]: I0314 09:01:33.671067 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 14 09:01:33 crc kubenswrapper[4956]: I0314 09:01:33.762182 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 14 09:01:33 crc kubenswrapper[4956]: I0314 09:01:33.797887 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 14 09:01:33 crc kubenswrapper[4956]: I0314 09:01:33.983649 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 14 09:01:34 crc kubenswrapper[4956]: I0314 09:01:34.184814 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 14 09:01:34 crc kubenswrapper[4956]: I0314 09:01:34.193801 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 14 09:01:34 crc kubenswrapper[4956]: I0314 09:01:34.211292 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 14 09:01:34 crc kubenswrapper[4956]: I0314 09:01:34.299513 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 14 09:01:34 crc kubenswrapper[4956]: I0314 09:01:34.305367 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 14 09:01:34 crc kubenswrapper[4956]: I0314 09:01:34.428973 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 14 09:01:34 crc kubenswrapper[4956]: I0314 09:01:34.441795 4956 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 14 09:01:34 crc kubenswrapper[4956]: I0314 09:01:34.472263 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 14 09:01:34 crc kubenswrapper[4956]: I0314 09:01:34.628761 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 14 09:01:34 crc kubenswrapper[4956]: I0314 09:01:34.660338 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 14 09:01:34 crc kubenswrapper[4956]: I0314 09:01:34.670633 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 14 09:01:34 crc kubenswrapper[4956]: I0314 09:01:34.750224 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 14 09:01:34 crc kubenswrapper[4956]: I0314 09:01:34.777369 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.174806 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.177234 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.210976 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.266298 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.276870 4956 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.344289 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.345986 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.354456 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.369230 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.380965 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.390056 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.434795 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.585439 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.607700 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.682403 4956 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.721145 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.746389 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.747065 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.747639 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.759514 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.780911 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.921206 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 09:01:35 crc kubenswrapper[4956]: I0314 09:01:35.934271 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 14 09:01:36 crc kubenswrapper[4956]: I0314 09:01:36.070721 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 09:01:36 crc kubenswrapper[4956]: I0314 09:01:36.079826 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 14 09:01:36 crc 
kubenswrapper[4956]: I0314 09:01:36.140807 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 14 09:01:36 crc kubenswrapper[4956]: I0314 09:01:36.151195 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 14 09:01:36 crc kubenswrapper[4956]: I0314 09:01:36.218429 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 14 09:01:36 crc kubenswrapper[4956]: I0314 09:01:36.296086 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 14 09:01:36 crc kubenswrapper[4956]: I0314 09:01:36.342437 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 14 09:01:36 crc kubenswrapper[4956]: I0314 09:01:36.347023 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 14 09:01:36 crc kubenswrapper[4956]: I0314 09:01:36.376447 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 14 09:01:36 crc kubenswrapper[4956]: I0314 09:01:36.407150 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 14 09:01:36 crc kubenswrapper[4956]: I0314 09:01:36.413995 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 14 09:01:36 crc kubenswrapper[4956]: I0314 09:01:36.485610 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 14 09:01:36 crc kubenswrapper[4956]: I0314 09:01:36.559836 4956 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"dns-default" Mar 14 09:01:36 crc kubenswrapper[4956]: I0314 09:01:36.693646 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 14 09:01:36 crc kubenswrapper[4956]: I0314 09:01:36.712972 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 14 09:01:36 crc kubenswrapper[4956]: I0314 09:01:36.910304 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 14 09:01:36 crc kubenswrapper[4956]: I0314 09:01:36.950637 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 14 09:01:36 crc kubenswrapper[4956]: I0314 09:01:36.987049 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.128524 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.191643 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.272630 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.281018 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.309460 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.361793 4956 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.404815 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.430475 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.436998 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.443461 4956 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.468414 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.468524 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.692251 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.712929 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.749522 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.922310 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.922764 4956 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Mar 14 09:01:37 crc kubenswrapper[4956]: I0314 09:01:37.988786 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.034537 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.036394 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.056157 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.119809 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.257097 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.257214 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.305719 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.478700 4956 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.537617 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.543551 4956 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.603379 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.609468 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.643854 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.644462 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.724959 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.772195 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.782986 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.871197 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.901695 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.915118 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 14 09:01:38 crc 
kubenswrapper[4956]: I0314 09:01:38.926991 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 14 09:01:38 crc kubenswrapper[4956]: I0314 09:01:38.943406 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 14 09:01:39 crc kubenswrapper[4956]: I0314 09:01:39.065447 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 14 09:01:39 crc kubenswrapper[4956]: I0314 09:01:39.126360 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 14 09:01:39 crc kubenswrapper[4956]: I0314 09:01:39.137095 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 14 09:01:39 crc kubenswrapper[4956]: I0314 09:01:39.293648 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 14 09:01:39 crc kubenswrapper[4956]: I0314 09:01:39.319414 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 09:01:39 crc kubenswrapper[4956]: I0314 09:01:39.390263 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 14 09:01:39 crc kubenswrapper[4956]: I0314 09:01:39.435593 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 14 09:01:39 crc kubenswrapper[4956]: I0314 09:01:39.547144 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 14 09:01:39 crc kubenswrapper[4956]: I0314 09:01:39.581578 4956 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 14 09:01:39 crc kubenswrapper[4956]: I0314 09:01:39.610991 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 14 09:01:39 crc kubenswrapper[4956]: I0314 09:01:39.663656 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 14 09:01:39 crc kubenswrapper[4956]: I0314 09:01:39.673844 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 14 09:01:39 crc kubenswrapper[4956]: I0314 09:01:39.832764 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 14 09:01:39 crc kubenswrapper[4956]: I0314 09:01:39.844069 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 14 09:01:39 crc kubenswrapper[4956]: I0314 09:01:39.952921 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 14 09:01:40 crc kubenswrapper[4956]: I0314 09:01:40.060780 4956 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 14 09:01:40 crc kubenswrapper[4956]: I0314 09:01:40.091435 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 09:01:40 crc kubenswrapper[4956]: I0314 09:01:40.300762 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 09:01:40 crc kubenswrapper[4956]: I0314 09:01:40.339065 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 14 09:01:40 crc kubenswrapper[4956]: I0314 
09:01:40.464273 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 14 09:01:40 crc kubenswrapper[4956]: I0314 09:01:40.464379 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 14 09:01:40 crc kubenswrapper[4956]: I0314 09:01:40.591730 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 14 09:01:40 crc kubenswrapper[4956]: I0314 09:01:40.687870 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 14 09:01:40 crc kubenswrapper[4956]: I0314 09:01:40.704466 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 14 09:01:40 crc kubenswrapper[4956]: I0314 09:01:40.770616 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 14 09:01:40 crc kubenswrapper[4956]: I0314 09:01:40.799699 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 14 09:01:40 crc kubenswrapper[4956]: I0314 09:01:40.811961 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 14 09:01:40 crc kubenswrapper[4956]: I0314 09:01:40.829174 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 14 09:01:40 crc kubenswrapper[4956]: I0314 09:01:40.864700 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 14 09:01:40 crc kubenswrapper[4956]: I0314 09:01:40.899226 4956 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 14 09:01:41 crc kubenswrapper[4956]: I0314 09:01:41.042262 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 14 09:01:41 crc kubenswrapper[4956]: I0314 09:01:41.086297 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 14 09:01:41 crc kubenswrapper[4956]: I0314 09:01:41.191567 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 14 09:01:41 crc kubenswrapper[4956]: I0314 09:01:41.214011 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 14 09:01:41 crc kubenswrapper[4956]: I0314 09:01:41.275453 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 14 09:01:41 crc kubenswrapper[4956]: I0314 09:01:41.313348 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 14 09:01:41 crc kubenswrapper[4956]: I0314 09:01:41.352150 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 14 09:01:41 crc kubenswrapper[4956]: I0314 09:01:41.364288 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 14 09:01:41 crc kubenswrapper[4956]: I0314 09:01:41.368749 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 14 09:01:41 crc kubenswrapper[4956]: I0314 09:01:41.445354 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 09:01:41 crc kubenswrapper[4956]: I0314 
09:01:41.532732 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 14 09:01:41 crc kubenswrapper[4956]: I0314 09:01:41.536077 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 14 09:01:41 crc kubenswrapper[4956]: I0314 09:01:41.643765 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 14 09:01:41 crc kubenswrapper[4956]: I0314 09:01:41.902074 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 14 09:01:41 crc kubenswrapper[4956]: I0314 09:01:41.916671 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.023550 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.078116 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.096843 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.172770 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.187063 4956 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.194348 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-m4xmw","openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.194443 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn"] Mar 14 09:01:42 crc kubenswrapper[4956]: E0314 09:01:42.194788 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab17437b-05d0-4ae8-8d53-2b7eb384acbe" containerName="installer" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.194822 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab17437b-05d0-4ae8-8d53-2b7eb384acbe" containerName="installer" Mar 14 09:01:42 crc kubenswrapper[4956]: E0314 09:01:42.194848 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f83565-681d-490b-bd27-d21b456c6e25" containerName="oauth-openshift" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.194865 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f83565-681d-490b-bd27-d21b456c6e25" containerName="oauth-openshift" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.195085 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="22f83565-681d-490b-bd27-d21b456c6e25" containerName="oauth-openshift" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.195130 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab17437b-05d0-4ae8-8d53-2b7eb384acbe" containerName="installer" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.195080 4956 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="398abfcc-e8de-4f30-ae7a-f20c3120f379" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.195286 4956 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="398abfcc-e8de-4f30-ae7a-f20c3120f379" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.195977 
4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.212817 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.219655 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.219709 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.220009 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.220061 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.220132 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.220594 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.221338 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.222682 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.224289 4956 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"openshift-service-ca.crt" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.225463 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.226603 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.226649 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.239225 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.243253 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.250615 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.309440 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.309421104 podStartE2EDuration="24.309421104s" podCreationTimestamp="2026-03-14 09:01:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:01:42.302877776 +0000 UTC m=+307.815570054" watchObservedRunningTime="2026-03-14 09:01:42.309421104 +0000 UTC m=+307.822113382" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.326669 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.326736 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.326767 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-session\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.326810 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.326847 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.326870 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-service-ca\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.326917 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-audit-policies\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.326955 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-router-certs\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.326980 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc 
kubenswrapper[4956]: I0314 09:01:42.327012 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-user-template-error\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.327038 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.327061 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw4q4\" (UniqueName: \"kubernetes.io/projected/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-kube-api-access-mw4q4\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.327093 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-user-template-login\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.327115 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-audit-dir\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.368732 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.428505 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.428579 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.428614 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-session\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.428643 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.428672 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.428706 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-service-ca\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.428756 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-audit-policies\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.428805 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-router-certs\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " 
pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.428843 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.428890 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-user-template-error\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.428925 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.428960 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw4q4\" (UniqueName: \"kubernetes.io/projected/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-kube-api-access-mw4q4\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.429461 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-user-template-login\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.429513 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-audit-dir\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.429570 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-audit-dir\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.430191 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-service-ca\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.430503 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-audit-policies\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.430613 
4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.430676 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.435814 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.435831 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-session\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.436177 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.437137 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.437634 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-router-certs\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.437786 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-user-template-error\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.438029 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-user-template-login\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.438786 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.446418 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw4q4\" (UniqueName: \"kubernetes.io/projected/0f6b5a5b-c45b-4397-a29a-7424dd06c8ff-kube-api-access-mw4q4\") pod \"oauth-openshift-7987bb8c7b-gzpwn\" (UID: \"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.462936 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.481126 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.515448 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.541617 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.544629 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.586751 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.696239 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.696613 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.734143 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn"] Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.860572 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 14 09:01:42 crc kubenswrapper[4956]: I0314 09:01:42.890986 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 14 09:01:43 crc kubenswrapper[4956]: I0314 09:01:43.006445 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 14 09:01:43 crc kubenswrapper[4956]: I0314 09:01:43.040081 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 14 09:01:43 crc kubenswrapper[4956]: I0314 09:01:43.104047 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 14 
09:01:43 crc kubenswrapper[4956]: I0314 09:01:43.215668 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22f83565-681d-490b-bd27-d21b456c6e25" path="/var/lib/kubelet/pods/22f83565-681d-490b-bd27-d21b456c6e25/volumes" Mar 14 09:01:43 crc kubenswrapper[4956]: I0314 09:01:43.325210 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 14 09:01:43 crc kubenswrapper[4956]: I0314 09:01:43.359318 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" event={"ID":"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff","Type":"ContainerStarted","Data":"82b9e9a4f81b104043ff9f38324e280b929e0dce864a46781883935c551b4858"} Mar 14 09:01:43 crc kubenswrapper[4956]: I0314 09:01:43.359622 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" event={"ID":"0f6b5a5b-c45b-4397-a29a-7424dd06c8ff","Type":"ContainerStarted","Data":"9c4d1f78723c23a451635e0ae51c4cc44147be107003b32d10fbc557279f9d1a"} Mar 14 09:01:43 crc kubenswrapper[4956]: I0314 09:01:43.359719 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:43 crc kubenswrapper[4956]: I0314 09:01:43.379518 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" podStartSLOduration=56.379495296 podStartE2EDuration="56.379495296s" podCreationTimestamp="2026-03-14 09:00:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:01:43.376619572 +0000 UTC m=+308.889311830" watchObservedRunningTime="2026-03-14 09:01:43.379495296 +0000 UTC m=+308.892187574" Mar 14 09:01:43 crc kubenswrapper[4956]: I0314 09:01:43.430810 4956 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 14 09:01:43 crc kubenswrapper[4956]: I0314 09:01:43.733431 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7987bb8c7b-gzpwn" Mar 14 09:01:43 crc kubenswrapper[4956]: I0314 09:01:43.899783 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 14 09:01:43 crc kubenswrapper[4956]: I0314 09:01:43.938642 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 14 09:01:43 crc kubenswrapper[4956]: I0314 09:01:43.942349 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 14 09:01:44 crc kubenswrapper[4956]: I0314 09:01:44.007444 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 09:01:44 crc kubenswrapper[4956]: I0314 09:01:44.045533 4956 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 14 09:01:44 crc kubenswrapper[4956]: I0314 09:01:44.085258 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 14 09:01:44 crc kubenswrapper[4956]: I0314 09:01:44.143132 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 14 09:01:44 crc kubenswrapper[4956]: I0314 09:01:44.219226 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 14 09:01:44 crc kubenswrapper[4956]: I0314 09:01:44.268793 4956 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 14 09:01:44 crc kubenswrapper[4956]: I0314 09:01:44.342852 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 14 09:01:44 crc kubenswrapper[4956]: I0314 09:01:44.348442 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 14 09:01:44 crc kubenswrapper[4956]: I0314 09:01:44.422469 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 14 09:01:44 crc kubenswrapper[4956]: I0314 09:01:44.427980 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 14 09:01:44 crc kubenswrapper[4956]: I0314 09:01:44.442111 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 14 09:01:44 crc kubenswrapper[4956]: I0314 09:01:44.724756 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 14 09:01:45 crc kubenswrapper[4956]: I0314 09:01:45.168112 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 14 09:01:45 crc kubenswrapper[4956]: I0314 09:01:45.192846 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 14 09:01:45 crc kubenswrapper[4956]: I0314 09:01:45.315888 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 14 09:01:45 crc kubenswrapper[4956]: I0314 09:01:45.708247 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 14 09:01:45 
crc kubenswrapper[4956]: I0314 09:01:45.777249 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 14 09:01:46 crc kubenswrapper[4956]: I0314 09:01:46.064911 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 14 09:01:46 crc kubenswrapper[4956]: I0314 09:01:46.650602 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 14 09:01:46 crc kubenswrapper[4956]: I0314 09:01:46.893543 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 14 09:01:46 crc kubenswrapper[4956]: I0314 09:01:46.980958 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 14 09:01:47 crc kubenswrapper[4956]: I0314 09:01:47.224340 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 14 09:01:47 crc kubenswrapper[4956]: I0314 09:01:47.458909 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 14 09:01:48 crc kubenswrapper[4956]: I0314 09:01:48.028130 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.052555 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7986466d5c-5v6d7"] Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.053036 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7" podUID="aec6ab77-fd2d-4f81-8721-7440032c9f24" containerName="controller-manager" 
containerID="cri-o://38ef84b9567c4183943626c57109c3afa6f7da31552e7a5ac819292bf95fc7be" gracePeriod=30 Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.155012 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"] Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.155221 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f" podUID="3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454" containerName="route-controller-manager" containerID="cri-o://affecb3b2c1ef39c11cbab889919b73267fdb9b6b7d569336f05676141acb51d" gracePeriod=30 Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.409681 4956 generic.go:334] "Generic (PLEG): container finished" podID="aec6ab77-fd2d-4f81-8721-7440032c9f24" containerID="38ef84b9567c4183943626c57109c3afa6f7da31552e7a5ac819292bf95fc7be" exitCode=0 Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.409744 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7" event={"ID":"aec6ab77-fd2d-4f81-8721-7440032c9f24","Type":"ContainerDied","Data":"38ef84b9567c4183943626c57109c3afa6f7da31552e7a5ac819292bf95fc7be"} Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.410115 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7" event={"ID":"aec6ab77-fd2d-4f81-8721-7440032c9f24","Type":"ContainerDied","Data":"fb9903c5bdb4b0a2bb93bb803335931237bee750d112b0884c08b50d65d6afc4"} Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.410138 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb9903c5bdb4b0a2bb93bb803335931237bee750d112b0884c08b50d65d6afc4" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.412046 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454" containerID="affecb3b2c1ef39c11cbab889919b73267fdb9b6b7d569336f05676141acb51d" exitCode=0 Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.412103 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f" event={"ID":"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454","Type":"ContainerDied","Data":"affecb3b2c1ef39c11cbab889919b73267fdb9b6b7d569336f05676141acb51d"} Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.429597 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.453580 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-proxy-ca-bundles\") pod \"aec6ab77-fd2d-4f81-8721-7440032c9f24\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.453656 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-client-ca\") pod \"aec6ab77-fd2d-4f81-8721-7440032c9f24\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.453684 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jvgg\" (UniqueName: \"kubernetes.io/projected/aec6ab77-fd2d-4f81-8721-7440032c9f24-kube-api-access-7jvgg\") pod \"aec6ab77-fd2d-4f81-8721-7440032c9f24\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.453711 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-config\") pod \"aec6ab77-fd2d-4f81-8721-7440032c9f24\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.453776 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aec6ab77-fd2d-4f81-8721-7440032c9f24-serving-cert\") pod \"aec6ab77-fd2d-4f81-8721-7440032c9f24\" (UID: \"aec6ab77-fd2d-4f81-8721-7440032c9f24\") " Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.454292 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "aec6ab77-fd2d-4f81-8721-7440032c9f24" (UID: "aec6ab77-fd2d-4f81-8721-7440032c9f24"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.455264 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-client-ca" (OuterVolumeSpecName: "client-ca") pod "aec6ab77-fd2d-4f81-8721-7440032c9f24" (UID: "aec6ab77-fd2d-4f81-8721-7440032c9f24"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.455630 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-config" (OuterVolumeSpecName: "config") pod "aec6ab77-fd2d-4f81-8721-7440032c9f24" (UID: "aec6ab77-fd2d-4f81-8721-7440032c9f24"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.459497 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aec6ab77-fd2d-4f81-8721-7440032c9f24-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "aec6ab77-fd2d-4f81-8721-7440032c9f24" (UID: "aec6ab77-fd2d-4f81-8721-7440032c9f24"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.460312 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aec6ab77-fd2d-4f81-8721-7440032c9f24-kube-api-access-7jvgg" (OuterVolumeSpecName: "kube-api-access-7jvgg") pod "aec6ab77-fd2d-4f81-8721-7440032c9f24" (UID: "aec6ab77-fd2d-4f81-8721-7440032c9f24"). InnerVolumeSpecName "kube-api-access-7jvgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.490281 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.554967 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-client-ca\") pod \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\" (UID: \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\") " Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.555017 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-config\") pod \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\" (UID: \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\") " Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.555043 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-serving-cert\") pod \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\" (UID: \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\") " Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.555094 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2vbc\" (UniqueName: \"kubernetes.io/projected/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-kube-api-access-v2vbc\") pod \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\" (UID: \"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454\") " Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.555240 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.555252 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aec6ab77-fd2d-4f81-8721-7440032c9f24-serving-cert\") on node 
\"crc\" DevicePath \"\"" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.555260 4956 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.555269 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aec6ab77-fd2d-4f81-8721-7440032c9f24-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.555277 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jvgg\" (UniqueName: \"kubernetes.io/projected/aec6ab77-fd2d-4f81-8721-7440032c9f24-kube-api-access-7jvgg\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.556039 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-client-ca" (OuterVolumeSpecName: "client-ca") pod "3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454" (UID: "3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.556142 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-config" (OuterVolumeSpecName: "config") pod "3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454" (UID: "3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.558247 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454" (UID: "3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.558437 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-kube-api-access-v2vbc" (OuterVolumeSpecName: "kube-api-access-v2vbc") pod "3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454" (UID: "3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454"). InnerVolumeSpecName "kube-api-access-v2vbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.656408 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.656452 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.656462 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.656471 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2vbc\" (UniqueName: \"kubernetes.io/projected/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454-kube-api-access-v2vbc\") on node \"crc\" DevicePath 
\"\"" Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.852764 4956 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 14 09:01:51 crc kubenswrapper[4956]: I0314 09:01:51.853192 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://aa3e6b25c08ed5bb0517e49549c3a64499d062263b1b177af6a78b364b1066c2" gracePeriod=5 Mar 14 09:01:52 crc kubenswrapper[4956]: I0314 09:01:52.417928 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7986466d5c-5v6d7" Mar 14 09:01:52 crc kubenswrapper[4956]: I0314 09:01:52.417974 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f" Mar 14 09:01:52 crc kubenswrapper[4956]: I0314 09:01:52.417973 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f" event={"ID":"3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454","Type":"ContainerDied","Data":"8d705fe1576fd29e9844ba445d7643307939c6b2d08aa7c92ebddbfcd5360d58"} Mar 14 09:01:52 crc kubenswrapper[4956]: I0314 09:01:52.418041 4956 scope.go:117] "RemoveContainer" containerID="affecb3b2c1ef39c11cbab889919b73267fdb9b6b7d569336f05676141acb51d" Mar 14 09:01:52 crc kubenswrapper[4956]: I0314 09:01:52.453282 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7986466d5c-5v6d7"] Mar 14 09:01:52 crc kubenswrapper[4956]: I0314 09:01:52.459039 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7986466d5c-5v6d7"] Mar 14 09:01:52 crc kubenswrapper[4956]: I0314 09:01:52.472739 4956 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"] Mar 14 09:01:52 crc kubenswrapper[4956]: I0314 09:01:52.478667 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c7c6576c-grw8f"] Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.066093 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-84f99cfc54-lmdq5"] Mar 14 09:01:53 crc kubenswrapper[4956]: E0314 09:01:53.066416 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.066449 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 14 09:01:53 crc kubenswrapper[4956]: E0314 09:01:53.066474 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454" containerName="route-controller-manager" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.066520 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454" containerName="route-controller-manager" Mar 14 09:01:53 crc kubenswrapper[4956]: E0314 09:01:53.066565 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aec6ab77-fd2d-4f81-8721-7440032c9f24" containerName="controller-manager" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.066585 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec6ab77-fd2d-4f81-8721-7440032c9f24" containerName="controller-manager" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.066827 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="aec6ab77-fd2d-4f81-8721-7440032c9f24" containerName="controller-manager" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.066875 
4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.066905 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454" containerName="route-controller-manager" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.067642 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.069568 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.069919 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.070236 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn"] Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.071404 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.072064 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-proxy-ca-bundles\") pod \"controller-manager-84f99cfc54-lmdq5\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.072100 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-client-ca\") pod \"controller-manager-84f99cfc54-lmdq5\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.072141 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-config\") pod \"controller-manager-84f99cfc54-lmdq5\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.072227 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv7hl\" (UniqueName: \"kubernetes.io/projected/74a60b5e-a9a1-429d-a156-6e13e3a12266-kube-api-access-bv7hl\") pod \"controller-manager-84f99cfc54-lmdq5\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.072325 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74a60b5e-a9a1-429d-a156-6e13e3a12266-serving-cert\") pod \"controller-manager-84f99cfc54-lmdq5\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.073327 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.073478 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.074795 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.074863 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.074877 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.074795 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.074913 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.074917 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.075028 4956 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.076227 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.087136 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.098019 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn"] Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.102360 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84f99cfc54-lmdq5"] Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.173875 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74a60b5e-a9a1-429d-a156-6e13e3a12266-serving-cert\") pod \"controller-manager-84f99cfc54-lmdq5\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.173931 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b77e719-50a1-432c-8701-8fb35b9c68b8-client-ca\") pod \"route-controller-manager-dfb87fb89-mcdtn\" (UID: \"1b77e719-50a1-432c-8701-8fb35b9c68b8\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.173959 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-proxy-ca-bundles\") pod 
\"controller-manager-84f99cfc54-lmdq5\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.173987 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-client-ca\") pod \"controller-manager-84f99cfc54-lmdq5\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.174031 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b77e719-50a1-432c-8701-8fb35b9c68b8-serving-cert\") pod \"route-controller-manager-dfb87fb89-mcdtn\" (UID: \"1b77e719-50a1-432c-8701-8fb35b9c68b8\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.174051 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzbnv\" (UniqueName: \"kubernetes.io/projected/1b77e719-50a1-432c-8701-8fb35b9c68b8-kube-api-access-rzbnv\") pod \"route-controller-manager-dfb87fb89-mcdtn\" (UID: \"1b77e719-50a1-432c-8701-8fb35b9c68b8\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.174075 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-config\") pod \"controller-manager-84f99cfc54-lmdq5\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.174095 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bv7hl\" (UniqueName: \"kubernetes.io/projected/74a60b5e-a9a1-429d-a156-6e13e3a12266-kube-api-access-bv7hl\") pod \"controller-manager-84f99cfc54-lmdq5\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.174111 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b77e719-50a1-432c-8701-8fb35b9c68b8-config\") pod \"route-controller-manager-dfb87fb89-mcdtn\" (UID: \"1b77e719-50a1-432c-8701-8fb35b9c68b8\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.174976 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-proxy-ca-bundles\") pod \"controller-manager-84f99cfc54-lmdq5\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.175156 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-client-ca\") pod \"controller-manager-84f99cfc54-lmdq5\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.175844 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-config\") pod \"controller-manager-84f99cfc54-lmdq5\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " 
pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.183973 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74a60b5e-a9a1-429d-a156-6e13e3a12266-serving-cert\") pod \"controller-manager-84f99cfc54-lmdq5\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.193663 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv7hl\" (UniqueName: \"kubernetes.io/projected/74a60b5e-a9a1-429d-a156-6e13e3a12266-kube-api-access-bv7hl\") pod \"controller-manager-84f99cfc54-lmdq5\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.216567 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454" path="/var/lib/kubelet/pods/3c2f0016-a8bc-41d2-8f8d-4f6fbb1bb454/volumes" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.217298 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aec6ab77-fd2d-4f81-8721-7440032c9f24" path="/var/lib/kubelet/pods/aec6ab77-fd2d-4f81-8721-7440032c9f24/volumes" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.275184 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b77e719-50a1-432c-8701-8fb35b9c68b8-client-ca\") pod \"route-controller-manager-dfb87fb89-mcdtn\" (UID: \"1b77e719-50a1-432c-8701-8fb35b9c68b8\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.275241 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1b77e719-50a1-432c-8701-8fb35b9c68b8-serving-cert\") pod \"route-controller-manager-dfb87fb89-mcdtn\" (UID: \"1b77e719-50a1-432c-8701-8fb35b9c68b8\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.275262 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzbnv\" (UniqueName: \"kubernetes.io/projected/1b77e719-50a1-432c-8701-8fb35b9c68b8-kube-api-access-rzbnv\") pod \"route-controller-manager-dfb87fb89-mcdtn\" (UID: \"1b77e719-50a1-432c-8701-8fb35b9c68b8\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.275289 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b77e719-50a1-432c-8701-8fb35b9c68b8-config\") pod \"route-controller-manager-dfb87fb89-mcdtn\" (UID: \"1b77e719-50a1-432c-8701-8fb35b9c68b8\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.276218 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b77e719-50a1-432c-8701-8fb35b9c68b8-config\") pod \"route-controller-manager-dfb87fb89-mcdtn\" (UID: \"1b77e719-50a1-432c-8701-8fb35b9c68b8\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.277170 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b77e719-50a1-432c-8701-8fb35b9c68b8-client-ca\") pod \"route-controller-manager-dfb87fb89-mcdtn\" (UID: \"1b77e719-50a1-432c-8701-8fb35b9c68b8\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:01:53 crc 
kubenswrapper[4956]: I0314 09:01:53.279425 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b77e719-50a1-432c-8701-8fb35b9c68b8-serving-cert\") pod \"route-controller-manager-dfb87fb89-mcdtn\" (UID: \"1b77e719-50a1-432c-8701-8fb35b9c68b8\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.296976 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzbnv\" (UniqueName: \"kubernetes.io/projected/1b77e719-50a1-432c-8701-8fb35b9c68b8-kube-api-access-rzbnv\") pod \"route-controller-manager-dfb87fb89-mcdtn\" (UID: \"1b77e719-50a1-432c-8701-8fb35b9c68b8\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.386251 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.397622 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.605754 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn"] Mar 14 09:01:53 crc kubenswrapper[4956]: W0314 09:01:53.614750 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b77e719_50a1_432c_8701_8fb35b9c68b8.slice/crio-c749dcbd757c7ce0c885d9e6f489b89c1bbecabaa2a300f9539ea2e7b4805405 WatchSource:0}: Error finding container c749dcbd757c7ce0c885d9e6f489b89c1bbecabaa2a300f9539ea2e7b4805405: Status 404 returned error can't find the container with id c749dcbd757c7ce0c885d9e6f489b89c1bbecabaa2a300f9539ea2e7b4805405 Mar 14 09:01:53 crc kubenswrapper[4956]: I0314 09:01:53.889394 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84f99cfc54-lmdq5"] Mar 14 09:01:54 crc kubenswrapper[4956]: I0314 09:01:54.434410 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" event={"ID":"1b77e719-50a1-432c-8701-8fb35b9c68b8","Type":"ContainerStarted","Data":"6309259bef888e684582ef404cb0a4ae6b0eb0abb101ed31f22bffb132731733"} Mar 14 09:01:54 crc kubenswrapper[4956]: I0314 09:01:54.434440 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" event={"ID":"1b77e719-50a1-432c-8701-8fb35b9c68b8","Type":"ContainerStarted","Data":"c749dcbd757c7ce0c885d9e6f489b89c1bbecabaa2a300f9539ea2e7b4805405"} Mar 14 09:01:54 crc kubenswrapper[4956]: I0314 09:01:54.434740 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:01:54 crc kubenswrapper[4956]: 
I0314 09:01:54.436011 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" event={"ID":"74a60b5e-a9a1-429d-a156-6e13e3a12266","Type":"ContainerStarted","Data":"80a2f67cd9edf904699c6713f92343ed2623bbd92701ce1648115725bfb15e4c"} Mar 14 09:01:54 crc kubenswrapper[4956]: I0314 09:01:54.436033 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" event={"ID":"74a60b5e-a9a1-429d-a156-6e13e3a12266","Type":"ContainerStarted","Data":"670ec2d6fad04475871bfcd6981ce714fd3f35699bd7ff5b3f94dfc5d1c58d5f"} Mar 14 09:01:54 crc kubenswrapper[4956]: I0314 09:01:54.436511 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:54 crc kubenswrapper[4956]: I0314 09:01:54.439423 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:01:54 crc kubenswrapper[4956]: I0314 09:01:54.440976 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:01:54 crc kubenswrapper[4956]: I0314 09:01:54.480102 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" podStartSLOduration=3.480082776 podStartE2EDuration="3.480082776s" podCreationTimestamp="2026-03-14 09:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:01:54.457800233 +0000 UTC m=+319.970492501" watchObservedRunningTime="2026-03-14 09:01:54.480082776 +0000 UTC m=+319.992775044" Mar 14 09:01:54 crc kubenswrapper[4956]: I0314 09:01:54.483145 4956 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" podStartSLOduration=3.483131684 podStartE2EDuration="3.483131684s" podCreationTimestamp="2026-03-14 09:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:01:54.482535469 +0000 UTC m=+319.995227747" watchObservedRunningTime="2026-03-14 09:01:54.483131684 +0000 UTC m=+319.995823952" Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.448382 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.449021 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.453236 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.453301 4956 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="aa3e6b25c08ed5bb0517e49549c3a64499d062263b1b177af6a78b364b1066c2" exitCode=137 Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.453355 4956 scope.go:117] "RemoveContainer" containerID="aa3e6b25c08ed5bb0517e49549c3a64499d062263b1b177af6a78b364b1066c2" Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.453411 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.472074 4956 scope.go:117] "RemoveContainer" containerID="aa3e6b25c08ed5bb0517e49549c3a64499d062263b1b177af6a78b364b1066c2" Mar 14 09:01:57 crc kubenswrapper[4956]: E0314 09:01:57.472437 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa3e6b25c08ed5bb0517e49549c3a64499d062263b1b177af6a78b364b1066c2\": container with ID starting with aa3e6b25c08ed5bb0517e49549c3a64499d062263b1b177af6a78b364b1066c2 not found: ID does not exist" containerID="aa3e6b25c08ed5bb0517e49549c3a64499d062263b1b177af6a78b364b1066c2" Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.472475 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa3e6b25c08ed5bb0517e49549c3a64499d062263b1b177af6a78b364b1066c2"} err="failed to get container status \"aa3e6b25c08ed5bb0517e49549c3a64499d062263b1b177af6a78b364b1066c2\": rpc error: code = NotFound desc = could not find container \"aa3e6b25c08ed5bb0517e49549c3a64499d062263b1b177af6a78b364b1066c2\": container with ID starting with aa3e6b25c08ed5bb0517e49549c3a64499d062263b1b177af6a78b364b1066c2 not found: ID does not exist" Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.526863 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.526966 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 
14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.527035 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.527091 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.527065 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.527122 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.527185 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.527165 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.527229 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.528355 4956 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.528418 4956 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.528445 4956 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.528472 4956 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.539084 4956 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:01:57 crc kubenswrapper[4956]: I0314 09:01:57.629467 4956 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:59 crc kubenswrapper[4956]: I0314 09:01:59.221784 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 14 09:02:00 crc kubenswrapper[4956]: I0314 09:02:00.172114 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557982-vxp8p"] Mar 14 09:02:00 crc kubenswrapper[4956]: I0314 09:02:00.173773 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557982-vxp8p" Mar 14 09:02:00 crc kubenswrapper[4956]: I0314 09:02:00.178540 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:02:00 crc kubenswrapper[4956]: I0314 09:02:00.178694 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:02:00 crc kubenswrapper[4956]: I0314 09:02:00.178810 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:02:00 crc kubenswrapper[4956]: I0314 09:02:00.203262 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557982-vxp8p"] Mar 14 09:02:00 crc kubenswrapper[4956]: I0314 09:02:00.360773 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffhp2\" (UniqueName: \"kubernetes.io/projected/a6f200e5-e07e-4c92-a04a-1ed65b8e44ab-kube-api-access-ffhp2\") pod \"auto-csr-approver-29557982-vxp8p\" (UID: \"a6f200e5-e07e-4c92-a04a-1ed65b8e44ab\") " pod="openshift-infra/auto-csr-approver-29557982-vxp8p" Mar 14 09:02:00 crc kubenswrapper[4956]: I0314 09:02:00.462217 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffhp2\" (UniqueName: \"kubernetes.io/projected/a6f200e5-e07e-4c92-a04a-1ed65b8e44ab-kube-api-access-ffhp2\") pod \"auto-csr-approver-29557982-vxp8p\" (UID: \"a6f200e5-e07e-4c92-a04a-1ed65b8e44ab\") " pod="openshift-infra/auto-csr-approver-29557982-vxp8p" Mar 14 09:02:00 crc kubenswrapper[4956]: I0314 09:02:00.482632 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffhp2\" (UniqueName: \"kubernetes.io/projected/a6f200e5-e07e-4c92-a04a-1ed65b8e44ab-kube-api-access-ffhp2\") pod \"auto-csr-approver-29557982-vxp8p\" (UID: \"a6f200e5-e07e-4c92-a04a-1ed65b8e44ab\") " 
pod="openshift-infra/auto-csr-approver-29557982-vxp8p" Mar 14 09:02:00 crc kubenswrapper[4956]: I0314 09:02:00.496250 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557982-vxp8p" Mar 14 09:02:00 crc kubenswrapper[4956]: I0314 09:02:00.893733 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557982-vxp8p"] Mar 14 09:02:01 crc kubenswrapper[4956]: I0314 09:02:01.476877 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557982-vxp8p" event={"ID":"a6f200e5-e07e-4c92-a04a-1ed65b8e44ab","Type":"ContainerStarted","Data":"db6f9d2fc7e49e34efa58c7c625d42d8a212578e733b3a74e78481444af590ce"} Mar 14 09:02:02 crc kubenswrapper[4956]: I0314 09:02:02.483700 4956 generic.go:334] "Generic (PLEG): container finished" podID="a6f200e5-e07e-4c92-a04a-1ed65b8e44ab" containerID="8eadb51a233f4b014f70bbfe5913d9b0ff694d111108f7fd2a90cb9e27e5161f" exitCode=0 Mar 14 09:02:02 crc kubenswrapper[4956]: I0314 09:02:02.483745 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557982-vxp8p" event={"ID":"a6f200e5-e07e-4c92-a04a-1ed65b8e44ab","Type":"ContainerDied","Data":"8eadb51a233f4b014f70bbfe5913d9b0ff694d111108f7fd2a90cb9e27e5161f"} Mar 14 09:02:03 crc kubenswrapper[4956]: I0314 09:02:03.804676 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557982-vxp8p" Mar 14 09:02:04 crc kubenswrapper[4956]: I0314 09:02:04.002773 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffhp2\" (UniqueName: \"kubernetes.io/projected/a6f200e5-e07e-4c92-a04a-1ed65b8e44ab-kube-api-access-ffhp2\") pod \"a6f200e5-e07e-4c92-a04a-1ed65b8e44ab\" (UID: \"a6f200e5-e07e-4c92-a04a-1ed65b8e44ab\") " Mar 14 09:02:04 crc kubenswrapper[4956]: I0314 09:02:04.010707 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6f200e5-e07e-4c92-a04a-1ed65b8e44ab-kube-api-access-ffhp2" (OuterVolumeSpecName: "kube-api-access-ffhp2") pod "a6f200e5-e07e-4c92-a04a-1ed65b8e44ab" (UID: "a6f200e5-e07e-4c92-a04a-1ed65b8e44ab"). InnerVolumeSpecName "kube-api-access-ffhp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:04 crc kubenswrapper[4956]: I0314 09:02:04.103845 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffhp2\" (UniqueName: \"kubernetes.io/projected/a6f200e5-e07e-4c92-a04a-1ed65b8e44ab-kube-api-access-ffhp2\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:04 crc kubenswrapper[4956]: I0314 09:02:04.501552 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557982-vxp8p" event={"ID":"a6f200e5-e07e-4c92-a04a-1ed65b8e44ab","Type":"ContainerDied","Data":"db6f9d2fc7e49e34efa58c7c625d42d8a212578e733b3a74e78481444af590ce"} Mar 14 09:02:04 crc kubenswrapper[4956]: I0314 09:02:04.501612 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db6f9d2fc7e49e34efa58c7c625d42d8a212578e733b3a74e78481444af590ce" Mar 14 09:02:04 crc kubenswrapper[4956]: I0314 09:02:04.501661 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557982-vxp8p" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.073237 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-84f99cfc54-lmdq5"] Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.074137 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" podUID="74a60b5e-a9a1-429d-a156-6e13e3a12266" containerName="controller-manager" containerID="cri-o://80a2f67cd9edf904699c6713f92343ed2623bbd92701ce1648115725bfb15e4c" gracePeriod=30 Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.082351 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn"] Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.082658 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" podUID="1b77e719-50a1-432c-8701-8fb35b9c68b8" containerName="route-controller-manager" containerID="cri-o://6309259bef888e684582ef404cb0a4ae6b0eb0abb101ed31f22bffb132731733" gracePeriod=30 Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.540523 4956 generic.go:334] "Generic (PLEG): container finished" podID="74a60b5e-a9a1-429d-a156-6e13e3a12266" containerID="80a2f67cd9edf904699c6713f92343ed2623bbd92701ce1648115725bfb15e4c" exitCode=0 Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.540562 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" event={"ID":"74a60b5e-a9a1-429d-a156-6e13e3a12266","Type":"ContainerDied","Data":"80a2f67cd9edf904699c6713f92343ed2623bbd92701ce1648115725bfb15e4c"} Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.544707 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="1b77e719-50a1-432c-8701-8fb35b9c68b8" containerID="6309259bef888e684582ef404cb0a4ae6b0eb0abb101ed31f22bffb132731733" exitCode=0 Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.544754 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" event={"ID":"1b77e719-50a1-432c-8701-8fb35b9c68b8","Type":"ContainerDied","Data":"6309259bef888e684582ef404cb0a4ae6b0eb0abb101ed31f22bffb132731733"} Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.589830 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.593831 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzbnv\" (UniqueName: \"kubernetes.io/projected/1b77e719-50a1-432c-8701-8fb35b9c68b8-kube-api-access-rzbnv\") pod \"1b77e719-50a1-432c-8701-8fb35b9c68b8\" (UID: \"1b77e719-50a1-432c-8701-8fb35b9c68b8\") " Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.593904 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b77e719-50a1-432c-8701-8fb35b9c68b8-serving-cert\") pod \"1b77e719-50a1-432c-8701-8fb35b9c68b8\" (UID: \"1b77e719-50a1-432c-8701-8fb35b9c68b8\") " Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.593965 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b77e719-50a1-432c-8701-8fb35b9c68b8-config\") pod \"1b77e719-50a1-432c-8701-8fb35b9c68b8\" (UID: \"1b77e719-50a1-432c-8701-8fb35b9c68b8\") " Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.594064 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1b77e719-50a1-432c-8701-8fb35b9c68b8-client-ca\") pod \"1b77e719-50a1-432c-8701-8fb35b9c68b8\" (UID: \"1b77e719-50a1-432c-8701-8fb35b9c68b8\") " Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.594921 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b77e719-50a1-432c-8701-8fb35b9c68b8-client-ca" (OuterVolumeSpecName: "client-ca") pod "1b77e719-50a1-432c-8701-8fb35b9c68b8" (UID: "1b77e719-50a1-432c-8701-8fb35b9c68b8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.595401 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b77e719-50a1-432c-8701-8fb35b9c68b8-config" (OuterVolumeSpecName: "config") pod "1b77e719-50a1-432c-8701-8fb35b9c68b8" (UID: "1b77e719-50a1-432c-8701-8fb35b9c68b8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.599076 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b77e719-50a1-432c-8701-8fb35b9c68b8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1b77e719-50a1-432c-8701-8fb35b9c68b8" (UID: "1b77e719-50a1-432c-8701-8fb35b9c68b8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.599606 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b77e719-50a1-432c-8701-8fb35b9c68b8-kube-api-access-rzbnv" (OuterVolumeSpecName: "kube-api-access-rzbnv") pod "1b77e719-50a1-432c-8701-8fb35b9c68b8" (UID: "1b77e719-50a1-432c-8701-8fb35b9c68b8"). InnerVolumeSpecName "kube-api-access-rzbnv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.651551 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.695387 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-proxy-ca-bundles\") pod \"74a60b5e-a9a1-429d-a156-6e13e3a12266\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.695512 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74a60b5e-a9a1-429d-a156-6e13e3a12266-serving-cert\") pod \"74a60b5e-a9a1-429d-a156-6e13e3a12266\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.695543 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-client-ca\") pod \"74a60b5e-a9a1-429d-a156-6e13e3a12266\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.695574 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-config\") pod \"74a60b5e-a9a1-429d-a156-6e13e3a12266\" (UID: \"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.695630 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv7hl\" (UniqueName: \"kubernetes.io/projected/74a60b5e-a9a1-429d-a156-6e13e3a12266-kube-api-access-bv7hl\") pod \"74a60b5e-a9a1-429d-a156-6e13e3a12266\" (UID: 
\"74a60b5e-a9a1-429d-a156-6e13e3a12266\") " Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.695864 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b77e719-50a1-432c-8701-8fb35b9c68b8-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.695882 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b77e719-50a1-432c-8701-8fb35b9c68b8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.695897 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzbnv\" (UniqueName: \"kubernetes.io/projected/1b77e719-50a1-432c-8701-8fb35b9c68b8-kube-api-access-rzbnv\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.695909 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b77e719-50a1-432c-8701-8fb35b9c68b8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.696580 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-config" (OuterVolumeSpecName: "config") pod "74a60b5e-a9a1-429d-a156-6e13e3a12266" (UID: "74a60b5e-a9a1-429d-a156-6e13e3a12266"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.696679 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-client-ca" (OuterVolumeSpecName: "client-ca") pod "74a60b5e-a9a1-429d-a156-6e13e3a12266" (UID: "74a60b5e-a9a1-429d-a156-6e13e3a12266"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.696942 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "74a60b5e-a9a1-429d-a156-6e13e3a12266" (UID: "74a60b5e-a9a1-429d-a156-6e13e3a12266"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.698900 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74a60b5e-a9a1-429d-a156-6e13e3a12266-kube-api-access-bv7hl" (OuterVolumeSpecName: "kube-api-access-bv7hl") pod "74a60b5e-a9a1-429d-a156-6e13e3a12266" (UID: "74a60b5e-a9a1-429d-a156-6e13e3a12266"). InnerVolumeSpecName "kube-api-access-bv7hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.699150 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a60b5e-a9a1-429d-a156-6e13e3a12266-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "74a60b5e-a9a1-429d-a156-6e13e3a12266" (UID: "74a60b5e-a9a1-429d-a156-6e13e3a12266"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.797549 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74a60b5e-a9a1-429d-a156-6e13e3a12266-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.797586 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.797598 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.797612 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv7hl\" (UniqueName: \"kubernetes.io/projected/74a60b5e-a9a1-429d-a156-6e13e3a12266-kube-api-access-bv7hl\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:11 crc kubenswrapper[4956]: I0314 09:02:11.797622 4956 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74a60b5e-a9a1-429d-a156-6e13e3a12266-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:12 crc kubenswrapper[4956]: I0314 09:02:12.552691 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" Mar 14 09:02:12 crc kubenswrapper[4956]: I0314 09:02:12.552757 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn" event={"ID":"1b77e719-50a1-432c-8701-8fb35b9c68b8","Type":"ContainerDied","Data":"c749dcbd757c7ce0c885d9e6f489b89c1bbecabaa2a300f9539ea2e7b4805405"} Mar 14 09:02:12 crc kubenswrapper[4956]: I0314 09:02:12.553159 4956 scope.go:117] "RemoveContainer" containerID="6309259bef888e684582ef404cb0a4ae6b0eb0abb101ed31f22bffb132731733" Mar 14 09:02:12 crc kubenswrapper[4956]: I0314 09:02:12.555436 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" event={"ID":"74a60b5e-a9a1-429d-a156-6e13e3a12266","Type":"ContainerDied","Data":"670ec2d6fad04475871bfcd6981ce714fd3f35699bd7ff5b3f94dfc5d1c58d5f"} Mar 14 09:02:12 crc kubenswrapper[4956]: I0314 09:02:12.555564 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-84f99cfc54-lmdq5" Mar 14 09:02:12 crc kubenswrapper[4956]: I0314 09:02:12.585070 4956 scope.go:117] "RemoveContainer" containerID="80a2f67cd9edf904699c6713f92343ed2623bbd92701ce1648115725bfb15e4c" Mar 14 09:02:12 crc kubenswrapper[4956]: I0314 09:02:12.586692 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn"] Mar 14 09:02:12 crc kubenswrapper[4956]: I0314 09:02:12.593098 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfb87fb89-mcdtn"] Mar 14 09:02:12 crc kubenswrapper[4956]: I0314 09:02:12.601377 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-84f99cfc54-lmdq5"] Mar 14 09:02:12 crc kubenswrapper[4956]: I0314 09:02:12.605289 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-84f99cfc54-lmdq5"] Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.081331 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-869d66459b-c8w74"] Mar 14 09:02:13 crc kubenswrapper[4956]: E0314 09:02:13.081704 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f200e5-e07e-4c92-a04a-1ed65b8e44ab" containerName="oc" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.081723 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f200e5-e07e-4c92-a04a-1ed65b8e44ab" containerName="oc" Mar 14 09:02:13 crc kubenswrapper[4956]: E0314 09:02:13.081739 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74a60b5e-a9a1-429d-a156-6e13e3a12266" containerName="controller-manager" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.081748 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="74a60b5e-a9a1-429d-a156-6e13e3a12266" 
containerName="controller-manager" Mar 14 09:02:13 crc kubenswrapper[4956]: E0314 09:02:13.081771 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b77e719-50a1-432c-8701-8fb35b9c68b8" containerName="route-controller-manager" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.081780 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b77e719-50a1-432c-8701-8fb35b9c68b8" containerName="route-controller-manager" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.081901 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f200e5-e07e-4c92-a04a-1ed65b8e44ab" containerName="oc" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.081915 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b77e719-50a1-432c-8701-8fb35b9c68b8" containerName="route-controller-manager" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.081929 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="74a60b5e-a9a1-429d-a156-6e13e3a12266" containerName="controller-manager" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.083577 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.093168 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.093249 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.093299 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.093413 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.093606 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.093788 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.094117 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776"] Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.095899 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.100228 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.100517 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.100649 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.102007 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.103225 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.103357 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.106136 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.110296 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-client-ca\") pod \"controller-manager-869d66459b-c8w74\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.110333 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-client-ca\") pod \"route-controller-manager-7b7cd5c44b-ml776\" (UID: \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\") " pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.110354 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-serving-cert\") pod \"route-controller-manager-7b7cd5c44b-ml776\" (UID: \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\") " pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.110374 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g8sl\" (UniqueName: \"kubernetes.io/projected/ed54574a-c344-4f93-bd94-7472a0deb022-kube-api-access-9g8sl\") pod \"controller-manager-869d66459b-c8w74\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.110399 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5vt2\" (UniqueName: \"kubernetes.io/projected/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-kube-api-access-h5vt2\") pod \"route-controller-manager-7b7cd5c44b-ml776\" (UID: \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\") " pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.110418 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-config\") pod 
\"route-controller-manager-7b7cd5c44b-ml776\" (UID: \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\") " pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.110531 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed54574a-c344-4f93-bd94-7472a0deb022-serving-cert\") pod \"controller-manager-869d66459b-c8w74\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.110619 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-proxy-ca-bundles\") pod \"controller-manager-869d66459b-c8w74\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.110666 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-config\") pod \"controller-manager-869d66459b-c8w74\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.131255 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776"] Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.135618 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-869d66459b-c8w74"] Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.210949 4956 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-config\") pod \"controller-manager-869d66459b-c8w74\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.211014 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-client-ca\") pod \"controller-manager-869d66459b-c8w74\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.211041 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-client-ca\") pod \"route-controller-manager-7b7cd5c44b-ml776\" (UID: \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\") " pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.211063 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-serving-cert\") pod \"route-controller-manager-7b7cd5c44b-ml776\" (UID: \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\") " pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.211087 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g8sl\" (UniqueName: \"kubernetes.io/projected/ed54574a-c344-4f93-bd94-7472a0deb022-kube-api-access-9g8sl\") pod \"controller-manager-869d66459b-c8w74\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 
09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.211121 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5vt2\" (UniqueName: \"kubernetes.io/projected/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-kube-api-access-h5vt2\") pod \"route-controller-manager-7b7cd5c44b-ml776\" (UID: \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\") " pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.211157 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-config\") pod \"route-controller-manager-7b7cd5c44b-ml776\" (UID: \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\") " pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.211215 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed54574a-c344-4f93-bd94-7472a0deb022-serving-cert\") pod \"controller-manager-869d66459b-c8w74\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.211274 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-proxy-ca-bundles\") pod \"controller-manager-869d66459b-c8w74\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.212420 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-proxy-ca-bundles\") pod 
\"controller-manager-869d66459b-c8w74\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.213548 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-config\") pod \"controller-manager-869d66459b-c8w74\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.214203 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-client-ca\") pod \"controller-manager-869d66459b-c8w74\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.214960 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-client-ca\") pod \"route-controller-manager-7b7cd5c44b-ml776\" (UID: \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\") " pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.215230 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-config\") pod \"route-controller-manager-7b7cd5c44b-ml776\" (UID: \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\") " pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.218432 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b77e719-50a1-432c-8701-8fb35b9c68b8" 
path="/var/lib/kubelet/pods/1b77e719-50a1-432c-8701-8fb35b9c68b8/volumes" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.218983 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74a60b5e-a9a1-429d-a156-6e13e3a12266" path="/var/lib/kubelet/pods/74a60b5e-a9a1-429d-a156-6e13e3a12266/volumes" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.222470 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-serving-cert\") pod \"route-controller-manager-7b7cd5c44b-ml776\" (UID: \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\") " pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.234125 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed54574a-c344-4f93-bd94-7472a0deb022-serving-cert\") pod \"controller-manager-869d66459b-c8w74\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.234647 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5vt2\" (UniqueName: \"kubernetes.io/projected/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-kube-api-access-h5vt2\") pod \"route-controller-manager-7b7cd5c44b-ml776\" (UID: \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\") " pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.237537 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g8sl\" (UniqueName: \"kubernetes.io/projected/ed54574a-c344-4f93-bd94-7472a0deb022-kube-api-access-9g8sl\") pod \"controller-manager-869d66459b-c8w74\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " 
pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.412540 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.434138 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.644146 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-s8rkh"] Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.644775 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.661232 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-s8rkh"] Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.682541 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776"] Mar 14 09:02:13 crc kubenswrapper[4956]: W0314 09:02:13.685887 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f65f3f5_6345_4702_9c65_3c1fe42a36f6.slice/crio-ab5524763c642061905bb07a690df58fcfcda816953223c274727a982badd0ee WatchSource:0}: Error finding container ab5524763c642061905bb07a690df58fcfcda816953223c274727a982badd0ee: Status 404 returned error can't find the container with id ab5524763c642061905bb07a690df58fcfcda816953223c274727a982badd0ee Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.818022 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmxmt\" 
(UniqueName: \"kubernetes.io/projected/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-kube-api-access-rmxmt\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.818418 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-installation-pull-secrets\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.818460 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-bound-sa-token\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.818515 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-registry-certificates\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.818551 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-trusted-ca\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 
14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.818580 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-ca-trust-extracted\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.818606 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-registry-tls\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.818640 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.835079 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-869d66459b-c8w74"] Mar 14 09:02:13 crc kubenswrapper[4956]: W0314 09:02:13.841213 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded54574a_c344_4f93_bd94_7472a0deb022.slice/crio-fb62c70153e9c71a32f701d6d8e97d435c317d111f5f8edaa76397207dec6d65 WatchSource:0}: Error finding container fb62c70153e9c71a32f701d6d8e97d435c317d111f5f8edaa76397207dec6d65: Status 404 returned error can't find the container with id 
fb62c70153e9c71a32f701d6d8e97d435c317d111f5f8edaa76397207dec6d65 Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.842763 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.919841 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-registry-certificates\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.919899 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-trusted-ca\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.919931 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-ca-trust-extracted\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.919955 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-registry-tls\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.919981 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmxmt\" (UniqueName: \"kubernetes.io/projected/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-kube-api-access-rmxmt\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.920015 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-installation-pull-secrets\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.920040 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-bound-sa-token\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.921565 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-ca-trust-extracted\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.922289 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-trusted-ca\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.922467 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-registry-certificates\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.926533 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-registry-tls\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.926764 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-installation-pull-secrets\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.941395 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-bound-sa-token\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc 
kubenswrapper[4956]: I0314 09:02:13.942376 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmxmt\" (UniqueName: \"kubernetes.io/projected/a77d4adc-529a-4d32-93d9-ef0ee376a4ec-kube-api-access-rmxmt\") pod \"image-registry-66df7c8f76-s8rkh\" (UID: \"a77d4adc-529a-4d32-93d9-ef0ee376a4ec\") " pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:13 crc kubenswrapper[4956]: I0314 09:02:13.963532 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:14 crc kubenswrapper[4956]: I0314 09:02:14.145269 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-s8rkh"] Mar 14 09:02:14 crc kubenswrapper[4956]: I0314 09:02:14.574125 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" event={"ID":"a77d4adc-529a-4d32-93d9-ef0ee376a4ec","Type":"ContainerStarted","Data":"c3ed9154f49a671d8b2f84069cc1ede689bc1a8d52d067afffe04c54179f20b1"} Mar 14 09:02:14 crc kubenswrapper[4956]: I0314 09:02:14.574421 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" event={"ID":"a77d4adc-529a-4d32-93d9-ef0ee376a4ec","Type":"ContainerStarted","Data":"4e080cc1f51d41f3e7af0801025b65a6a904f4556e7bdb4d75b905297be62543"} Mar 14 09:02:14 crc kubenswrapper[4956]: I0314 09:02:14.574438 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:14 crc kubenswrapper[4956]: I0314 09:02:14.576096 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" event={"ID":"ed54574a-c344-4f93-bd94-7472a0deb022","Type":"ContainerStarted","Data":"3d90cb11021dcad0c8317d73768479a5e31276abcc9d4859fd0be78315698569"} Mar 14 09:02:14 
crc kubenswrapper[4956]: I0314 09:02:14.576134 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" event={"ID":"ed54574a-c344-4f93-bd94-7472a0deb022","Type":"ContainerStarted","Data":"fb62c70153e9c71a32f701d6d8e97d435c317d111f5f8edaa76397207dec6d65"} Mar 14 09:02:14 crc kubenswrapper[4956]: I0314 09:02:14.576291 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:14 crc kubenswrapper[4956]: I0314 09:02:14.579004 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" event={"ID":"3f65f3f5-6345-4702-9c65-3c1fe42a36f6","Type":"ContainerStarted","Data":"bb6518e780204b6f4d2781048c3386bfb7a58fbbfe600bf0f999d0c93ff5bd1d"} Mar 14 09:02:14 crc kubenswrapper[4956]: I0314 09:02:14.579028 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" event={"ID":"3f65f3f5-6345-4702-9c65-3c1fe42a36f6","Type":"ContainerStarted","Data":"ab5524763c642061905bb07a690df58fcfcda816953223c274727a982badd0ee"} Mar 14 09:02:14 crc kubenswrapper[4956]: I0314 09:02:14.579218 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:14 crc kubenswrapper[4956]: I0314 09:02:14.581299 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:14 crc kubenswrapper[4956]: I0314 09:02:14.584069 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:14 crc kubenswrapper[4956]: I0314 09:02:14.592984 4956 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" podStartSLOduration=1.592970696 podStartE2EDuration="1.592970696s" podCreationTimestamp="2026-03-14 09:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:02:14.590948413 +0000 UTC m=+340.103640681" watchObservedRunningTime="2026-03-14 09:02:14.592970696 +0000 UTC m=+340.105662964" Mar 14 09:02:14 crc kubenswrapper[4956]: I0314 09:02:14.612747 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" podStartSLOduration=3.612732331 podStartE2EDuration="3.612732331s" podCreationTimestamp="2026-03-14 09:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:02:14.608311966 +0000 UTC m=+340.121004244" watchObservedRunningTime="2026-03-14 09:02:14.612732331 +0000 UTC m=+340.125424599" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.084367 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" podStartSLOduration=20.084347905 podStartE2EDuration="20.084347905s" podCreationTimestamp="2026-03-14 09:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:02:14.626793608 +0000 UTC m=+340.139485876" watchObservedRunningTime="2026-03-14 09:02:31.084347905 +0000 UTC m=+356.597040173" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.085077 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-869d66459b-c8w74"] Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.085249 4956 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" podUID="ed54574a-c344-4f93-bd94-7472a0deb022" containerName="controller-manager" containerID="cri-o://3d90cb11021dcad0c8317d73768479a5e31276abcc9d4859fd0be78315698569" gracePeriod=30 Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.104766 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776"] Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.105280 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" podUID="3f65f3f5-6345-4702-9c65-3c1fe42a36f6" containerName="route-controller-manager" containerID="cri-o://bb6518e780204b6f4d2781048c3386bfb7a58fbbfe600bf0f999d0c93ff5bd1d" gracePeriod=30 Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.675003 4956 generic.go:334] "Generic (PLEG): container finished" podID="3f65f3f5-6345-4702-9c65-3c1fe42a36f6" containerID="bb6518e780204b6f4d2781048c3386bfb7a58fbbfe600bf0f999d0c93ff5bd1d" exitCode=0 Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.675290 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" event={"ID":"3f65f3f5-6345-4702-9c65-3c1fe42a36f6","Type":"ContainerDied","Data":"bb6518e780204b6f4d2781048c3386bfb7a58fbbfe600bf0f999d0c93ff5bd1d"} Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.675315 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" event={"ID":"3f65f3f5-6345-4702-9c65-3c1fe42a36f6","Type":"ContainerDied","Data":"ab5524763c642061905bb07a690df58fcfcda816953223c274727a982badd0ee"} Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.675324 4956 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ab5524763c642061905bb07a690df58fcfcda816953223c274727a982badd0ee" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.676607 4956 generic.go:334] "Generic (PLEG): container finished" podID="ed54574a-c344-4f93-bd94-7472a0deb022" containerID="3d90cb11021dcad0c8317d73768479a5e31276abcc9d4859fd0be78315698569" exitCode=0 Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.676631 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" event={"ID":"ed54574a-c344-4f93-bd94-7472a0deb022","Type":"ContainerDied","Data":"3d90cb11021dcad0c8317d73768479a5e31276abcc9d4859fd0be78315698569"} Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.691997 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.710155 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.848653 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed54574a-c344-4f93-bd94-7472a0deb022-serving-cert\") pod \"ed54574a-c344-4f93-bd94-7472a0deb022\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.848706 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-client-ca\") pod \"ed54574a-c344-4f93-bd94-7472a0deb022\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.848783 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-client-ca\") pod \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\" (UID: \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\") " Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.848813 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-proxy-ca-bundles\") pod \"ed54574a-c344-4f93-bd94-7472a0deb022\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.848843 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-serving-cert\") pod \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\" (UID: \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\") " Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.848892 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-config\") pod \"ed54574a-c344-4f93-bd94-7472a0deb022\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.848918 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-config\") pod \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\" (UID: \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\") " Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.848941 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g8sl\" (UniqueName: \"kubernetes.io/projected/ed54574a-c344-4f93-bd94-7472a0deb022-kube-api-access-9g8sl\") pod \"ed54574a-c344-4f93-bd94-7472a0deb022\" (UID: \"ed54574a-c344-4f93-bd94-7472a0deb022\") " Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.848980 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5vt2\" (UniqueName: \"kubernetes.io/projected/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-kube-api-access-h5vt2\") pod \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\" (UID: \"3f65f3f5-6345-4702-9c65-3c1fe42a36f6\") " Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.851011 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ed54574a-c344-4f93-bd94-7472a0deb022" (UID: "ed54574a-c344-4f93-bd94-7472a0deb022"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.851045 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-config" (OuterVolumeSpecName: "config") pod "ed54574a-c344-4f93-bd94-7472a0deb022" (UID: "ed54574a-c344-4f93-bd94-7472a0deb022"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.851435 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-client-ca" (OuterVolumeSpecName: "client-ca") pod "ed54574a-c344-4f93-bd94-7472a0deb022" (UID: "ed54574a-c344-4f93-bd94-7472a0deb022"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.851671 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-config" (OuterVolumeSpecName: "config") pod "3f65f3f5-6345-4702-9c65-3c1fe42a36f6" (UID: "3f65f3f5-6345-4702-9c65-3c1fe42a36f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.851710 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-client-ca" (OuterVolumeSpecName: "client-ca") pod "3f65f3f5-6345-4702-9c65-3c1fe42a36f6" (UID: "3f65f3f5-6345-4702-9c65-3c1fe42a36f6"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.855450 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-kube-api-access-h5vt2" (OuterVolumeSpecName: "kube-api-access-h5vt2") pod "3f65f3f5-6345-4702-9c65-3c1fe42a36f6" (UID: "3f65f3f5-6345-4702-9c65-3c1fe42a36f6"). InnerVolumeSpecName "kube-api-access-h5vt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.855517 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed54574a-c344-4f93-bd94-7472a0deb022-kube-api-access-9g8sl" (OuterVolumeSpecName: "kube-api-access-9g8sl") pod "ed54574a-c344-4f93-bd94-7472a0deb022" (UID: "ed54574a-c344-4f93-bd94-7472a0deb022"). InnerVolumeSpecName "kube-api-access-9g8sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.858600 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3f65f3f5-6345-4702-9c65-3c1fe42a36f6" (UID: "3f65f3f5-6345-4702-9c65-3c1fe42a36f6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.858611 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed54574a-c344-4f93-bd94-7472a0deb022-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ed54574a-c344-4f93-bd94-7472a0deb022" (UID: "ed54574a-c344-4f93-bd94-7472a0deb022"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.950107 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.950136 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.950146 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g8sl\" (UniqueName: \"kubernetes.io/projected/ed54574a-c344-4f93-bd94-7472a0deb022-kube-api-access-9g8sl\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.950155 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5vt2\" (UniqueName: \"kubernetes.io/projected/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-kube-api-access-h5vt2\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.950164 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed54574a-c344-4f93-bd94-7472a0deb022-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.950173 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.950181 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.950189 4956 reconciler_common.go:293] "Volume 
detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed54574a-c344-4f93-bd94-7472a0deb022-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:31 crc kubenswrapper[4956]: I0314 09:02:31.950197 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f65f3f5-6345-4702-9c65-3c1fe42a36f6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:32 crc kubenswrapper[4956]: I0314 09:02:32.684657 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776" Mar 14 09:02:32 crc kubenswrapper[4956]: I0314 09:02:32.685005 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" event={"ID":"ed54574a-c344-4f93-bd94-7472a0deb022","Type":"ContainerDied","Data":"fb62c70153e9c71a32f701d6d8e97d435c317d111f5f8edaa76397207dec6d65"} Mar 14 09:02:32 crc kubenswrapper[4956]: I0314 09:02:32.685093 4956 scope.go:117] "RemoveContainer" containerID="3d90cb11021dcad0c8317d73768479a5e31276abcc9d4859fd0be78315698569" Mar 14 09:02:32 crc kubenswrapper[4956]: I0314 09:02:32.686545 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-869d66459b-c8w74" Mar 14 09:02:32 crc kubenswrapper[4956]: I0314 09:02:32.714629 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776"] Mar 14 09:02:32 crc kubenswrapper[4956]: I0314 09:02:32.718844 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b7cd5c44b-ml776"] Mar 14 09:02:32 crc kubenswrapper[4956]: I0314 09:02:32.726817 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-869d66459b-c8w74"] Mar 14 09:02:32 crc kubenswrapper[4956]: I0314 09:02:32.726916 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-869d66459b-c8w74"] Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.091715 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-84f99cfc54-5kv5k"] Mar 14 09:02:33 crc kubenswrapper[4956]: E0314 09:02:33.091959 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed54574a-c344-4f93-bd94-7472a0deb022" containerName="controller-manager" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.091975 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed54574a-c344-4f93-bd94-7472a0deb022" containerName="controller-manager" Mar 14 09:02:33 crc kubenswrapper[4956]: E0314 09:02:33.091999 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f65f3f5-6345-4702-9c65-3c1fe42a36f6" containerName="route-controller-manager" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.092007 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f65f3f5-6345-4702-9c65-3c1fe42a36f6" containerName="route-controller-manager" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.092127 4956 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="ed54574a-c344-4f93-bd94-7472a0deb022" containerName="controller-manager" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.092145 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f65f3f5-6345-4702-9c65-3c1fe42a36f6" containerName="route-controller-manager" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.092595 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.096309 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.098135 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.098176 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.098206 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.098391 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.100150 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.103999 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg"] Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.104538 4956 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.109391 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.111873 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.116388 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.118341 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.118588 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.118878 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.119072 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.124721 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84f99cfc54-5kv5k"] Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.135390 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg"] Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.215549 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f65f3f5-6345-4702-9c65-3c1fe42a36f6" 
path="/var/lib/kubelet/pods/3f65f3f5-6345-4702-9c65-3c1fe42a36f6/volumes" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.216063 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed54574a-c344-4f93-bd94-7472a0deb022" path="/var/lib/kubelet/pods/ed54574a-c344-4f93-bd94-7472a0deb022/volumes" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.265875 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca-serving-cert\") pod \"controller-manager-84f99cfc54-5kv5k\" (UID: \"7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.266133 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be-serving-cert\") pod \"route-controller-manager-dfb87fb89-5x8mg\" (UID: \"a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.266251 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcrqg\" (UniqueName: \"kubernetes.io/projected/a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be-kube-api-access-mcrqg\") pod \"route-controller-manager-dfb87fb89-5x8mg\" (UID: \"a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.266377 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be-config\") pod \"route-controller-manager-dfb87fb89-5x8mg\" (UID: 
\"a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.266500 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwg4t\" (UniqueName: \"kubernetes.io/projected/7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca-kube-api-access-qwg4t\") pod \"controller-manager-84f99cfc54-5kv5k\" (UID: \"7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.266634 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca-proxy-ca-bundles\") pod \"controller-manager-84f99cfc54-5kv5k\" (UID: \"7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.266740 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca-config\") pod \"controller-manager-84f99cfc54-5kv5k\" (UID: \"7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.266907 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be-client-ca\") pod \"route-controller-manager-dfb87fb89-5x8mg\" (UID: \"a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.267015 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca-client-ca\") pod \"controller-manager-84f99cfc54-5kv5k\" (UID: \"7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.368446 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcrqg\" (UniqueName: \"kubernetes.io/projected/a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be-kube-api-access-mcrqg\") pod \"route-controller-manager-dfb87fb89-5x8mg\" (UID: \"a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.369140 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be-config\") pod \"route-controller-manager-dfb87fb89-5x8mg\" (UID: \"a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.370353 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwg4t\" (UniqueName: \"kubernetes.io/projected/7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca-kube-api-access-qwg4t\") pod \"controller-manager-84f99cfc54-5kv5k\" (UID: \"7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.370446 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca-proxy-ca-bundles\") pod \"controller-manager-84f99cfc54-5kv5k\" (UID: \"7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca\") 
" pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.370542 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca-config\") pod \"controller-manager-84f99cfc54-5kv5k\" (UID: \"7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.370631 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be-client-ca\") pod \"route-controller-manager-dfb87fb89-5x8mg\" (UID: \"a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.370717 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca-client-ca\") pod \"controller-manager-84f99cfc54-5kv5k\" (UID: \"7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.370830 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca-serving-cert\") pod \"controller-manager-84f99cfc54-5kv5k\" (UID: \"7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.370900 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be-serving-cert\") pod 
\"route-controller-manager-dfb87fb89-5x8mg\" (UID: \"a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.372769 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca-client-ca\") pod \"controller-manager-84f99cfc54-5kv5k\" (UID: \"7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.370312 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be-config\") pod \"route-controller-manager-dfb87fb89-5x8mg\" (UID: \"a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.373541 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca-config\") pod \"controller-manager-84f99cfc54-5kv5k\" (UID: \"7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.373655 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca-proxy-ca-bundles\") pod \"controller-manager-84f99cfc54-5kv5k\" (UID: \"7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.373877 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be-client-ca\") pod \"route-controller-manager-dfb87fb89-5x8mg\" (UID: \"a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.376621 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca-serving-cert\") pod \"controller-manager-84f99cfc54-5kv5k\" (UID: \"7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.385126 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be-serving-cert\") pod \"route-controller-manager-dfb87fb89-5x8mg\" (UID: \"a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.399783 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcrqg\" (UniqueName: \"kubernetes.io/projected/a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be-kube-api-access-mcrqg\") pod \"route-controller-manager-dfb87fb89-5x8mg\" (UID: \"a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be\") " pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.403352 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwg4t\" (UniqueName: \"kubernetes.io/projected/7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca-kube-api-access-qwg4t\") pod \"controller-manager-84f99cfc54-5kv5k\" (UID: \"7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca\") " pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:33 crc kubenswrapper[4956]: 
I0314 09:02:33.425444 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.441752 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" Mar 14 09:02:33 crc kubenswrapper[4956]: W0314 09:02:33.837899 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2c55e27_5f7d_42f8_b2a8_4eac60c4c7be.slice/crio-a7af69a22ea8152806987ca9c0d224760054b501b37327c01e3235ade5faa216 WatchSource:0}: Error finding container a7af69a22ea8152806987ca9c0d224760054b501b37327c01e3235ade5faa216: Status 404 returned error can't find the container with id a7af69a22ea8152806987ca9c0d224760054b501b37327c01e3235ade5faa216 Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.839204 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg"] Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.877803 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84f99cfc54-5kv5k"] Mar 14 09:02:33 crc kubenswrapper[4956]: I0314 09:02:33.971764 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-s8rkh" Mar 14 09:02:34 crc kubenswrapper[4956]: I0314 09:02:34.058575 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9s4fs"] Mar 14 09:02:34 crc kubenswrapper[4956]: I0314 09:02:34.697887 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" 
event={"ID":"a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be","Type":"ContainerStarted","Data":"d9da73d94e07e2669c976ab36b6ac7421c665305d04c315d1daf5257955ea1bb"} Mar 14 09:02:34 crc kubenswrapper[4956]: I0314 09:02:34.698166 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" event={"ID":"a2c55e27-5f7d-42f8-b2a8-4eac60c4c7be","Type":"ContainerStarted","Data":"a7af69a22ea8152806987ca9c0d224760054b501b37327c01e3235ade5faa216"} Mar 14 09:02:34 crc kubenswrapper[4956]: I0314 09:02:34.698183 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" Mar 14 09:02:34 crc kubenswrapper[4956]: I0314 09:02:34.699541 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" event={"ID":"7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca","Type":"ContainerStarted","Data":"91fe07c8674fdc22d96afd39b193708c481d5cb655466017201050b93694e2dc"} Mar 14 09:02:34 crc kubenswrapper[4956]: I0314 09:02:34.699576 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" event={"ID":"7eb13c2f-e8ec-4f7c-8b17-e6d425c7c7ca","Type":"ContainerStarted","Data":"e1e72e8d725fce39e8605ccb48716c5c4949e60991995b0da0d9698aafe6275f"} Mar 14 09:02:34 crc kubenswrapper[4956]: I0314 09:02:34.700038 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:34 crc kubenswrapper[4956]: I0314 09:02:34.702984 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" Mar 14 09:02:34 crc kubenswrapper[4956]: I0314 09:02:34.705517 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" Mar 14 09:02:34 crc kubenswrapper[4956]: I0314 09:02:34.715931 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-dfb87fb89-5x8mg" podStartSLOduration=3.715918357 podStartE2EDuration="3.715918357s" podCreationTimestamp="2026-03-14 09:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:02:34.714858689 +0000 UTC m=+360.227550957" watchObservedRunningTime="2026-03-14 09:02:34.715918357 +0000 UTC m=+360.228610625" Mar 14 09:02:34 crc kubenswrapper[4956]: I0314 09:02:34.755876 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-84f99cfc54-5kv5k" podStartSLOduration=3.755856869 podStartE2EDuration="3.755856869s" podCreationTimestamp="2026-03-14 09:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:02:34.729271285 +0000 UTC m=+360.241963553" watchObservedRunningTime="2026-03-14 09:02:34.755856869 +0000 UTC m=+360.268549137" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.335357 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j2vtm"] Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.335944 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j2vtm" podUID="7f2f9b72-16af-43fb-9687-fe7cbbc51bb3" containerName="registry-server" containerID="cri-o://c9fa095b85996840c323172eeb91450b73eb689debc81d59a28df6655744e781" gracePeriod=30 Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.345155 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8xw98"] Mar 14 
09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.345472 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8xw98" podUID="6a3b7192-2792-4295-b25d-a22c476cd174" containerName="registry-server" containerID="cri-o://2022c86d8eb04405509c2f11141a901012b463aba8b34a9e4e8365477e8f6112" gracePeriod=30 Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.367206 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2hj4h"] Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.367508 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" podUID="0eb00c56-ab05-4c9d-a1d4-a80a53775d4e" containerName="marketplace-operator" containerID="cri-o://e8ef7959509dcd8e16bf0fbeafc42f319f009434e5944d4e5b80dca77cee3fce" gracePeriod=30 Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.382957 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4ztd"] Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.383274 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f4ztd" podUID="78b22cf8-2118-463f-804e-9890feee4427" containerName="registry-server" containerID="cri-o://08115a7096065a8cb729b565a5b2929affd72c301c045bd48274eac6230d07a8" gracePeriod=30 Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.387395 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n2x6s"] Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.387615 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n2x6s" podUID="2667e495-4a15-4aa2-8839-e1b66f2ee380" containerName="registry-server" 
containerID="cri-o://e48dd7440dc630c42703dac0e9aa62de2970a6e1a7275f3cfffd29dc2410bd47" gracePeriod=30 Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.390991 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7gnpp"] Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.392120 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7gnpp" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.402257 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7gnpp"] Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.502575 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92ce0359-87bc-46d6-8673-b10febbf0742-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7gnpp\" (UID: \"92ce0359-87bc-46d6-8673-b10febbf0742\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gnpp" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.502646 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkhs7\" (UniqueName: \"kubernetes.io/projected/92ce0359-87bc-46d6-8673-b10febbf0742-kube-api-access-dkhs7\") pod \"marketplace-operator-79b997595-7gnpp\" (UID: \"92ce0359-87bc-46d6-8673-b10febbf0742\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gnpp" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.502674 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/92ce0359-87bc-46d6-8673-b10febbf0742-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7gnpp\" (UID: \"92ce0359-87bc-46d6-8673-b10febbf0742\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-7gnpp" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.603886 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92ce0359-87bc-46d6-8673-b10febbf0742-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7gnpp\" (UID: \"92ce0359-87bc-46d6-8673-b10febbf0742\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gnpp" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.604162 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkhs7\" (UniqueName: \"kubernetes.io/projected/92ce0359-87bc-46d6-8673-b10febbf0742-kube-api-access-dkhs7\") pod \"marketplace-operator-79b997595-7gnpp\" (UID: \"92ce0359-87bc-46d6-8673-b10febbf0742\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gnpp" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.604190 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/92ce0359-87bc-46d6-8673-b10febbf0742-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7gnpp\" (UID: \"92ce0359-87bc-46d6-8673-b10febbf0742\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gnpp" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.605262 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92ce0359-87bc-46d6-8673-b10febbf0742-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7gnpp\" (UID: \"92ce0359-87bc-46d6-8673-b10febbf0742\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gnpp" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.611667 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/92ce0359-87bc-46d6-8673-b10febbf0742-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7gnpp\" (UID: \"92ce0359-87bc-46d6-8673-b10febbf0742\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gnpp" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.620848 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkhs7\" (UniqueName: \"kubernetes.io/projected/92ce0359-87bc-46d6-8673-b10febbf0742-kube-api-access-dkhs7\") pod \"marketplace-operator-79b997595-7gnpp\" (UID: \"92ce0359-87bc-46d6-8673-b10febbf0742\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gnpp" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.713383 4956 generic.go:334] "Generic (PLEG): container finished" podID="78b22cf8-2118-463f-804e-9890feee4427" containerID="08115a7096065a8cb729b565a5b2929affd72c301c045bd48274eac6230d07a8" exitCode=0 Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.713450 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4ztd" event={"ID":"78b22cf8-2118-463f-804e-9890feee4427","Type":"ContainerDied","Data":"08115a7096065a8cb729b565a5b2929affd72c301c045bd48274eac6230d07a8"} Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.715221 4956 generic.go:334] "Generic (PLEG): container finished" podID="0eb00c56-ab05-4c9d-a1d4-a80a53775d4e" containerID="e8ef7959509dcd8e16bf0fbeafc42f319f009434e5944d4e5b80dca77cee3fce" exitCode=0 Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.715262 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" event={"ID":"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e","Type":"ContainerDied","Data":"e8ef7959509dcd8e16bf0fbeafc42f319f009434e5944d4e5b80dca77cee3fce"} Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.736967 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="7f2f9b72-16af-43fb-9687-fe7cbbc51bb3" containerID="c9fa095b85996840c323172eeb91450b73eb689debc81d59a28df6655744e781" exitCode=0 Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.737051 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2vtm" event={"ID":"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3","Type":"ContainerDied","Data":"c9fa095b85996840c323172eeb91450b73eb689debc81d59a28df6655744e781"} Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.742651 4956 generic.go:334] "Generic (PLEG): container finished" podID="2667e495-4a15-4aa2-8839-e1b66f2ee380" containerID="e48dd7440dc630c42703dac0e9aa62de2970a6e1a7275f3cfffd29dc2410bd47" exitCode=0 Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.742726 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2x6s" event={"ID":"2667e495-4a15-4aa2-8839-e1b66f2ee380","Type":"ContainerDied","Data":"e48dd7440dc630c42703dac0e9aa62de2970a6e1a7275f3cfffd29dc2410bd47"} Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.746180 4956 generic.go:334] "Generic (PLEG): container finished" podID="6a3b7192-2792-4295-b25d-a22c476cd174" containerID="2022c86d8eb04405509c2f11141a901012b463aba8b34a9e4e8365477e8f6112" exitCode=0 Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.746244 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xw98" event={"ID":"6a3b7192-2792-4295-b25d-a22c476cd174","Type":"ContainerDied","Data":"2022c86d8eb04405509c2f11141a901012b463aba8b34a9e4e8365477e8f6112"} Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.773540 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7gnpp" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.784793 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j2vtm" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.876743 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4ztd" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.907605 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-catalog-content\") pod \"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3\" (UID: \"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3\") " Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.907690 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-utilities\") pod \"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3\" (UID: \"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3\") " Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.907740 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpwz5\" (UniqueName: \"kubernetes.io/projected/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-kube-api-access-vpwz5\") pod \"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3\" (UID: \"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3\") " Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.908711 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-utilities" (OuterVolumeSpecName: "utilities") pod "7f2f9b72-16af-43fb-9687-fe7cbbc51bb3" (UID: "7f2f9b72-16af-43fb-9687-fe7cbbc51bb3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.913575 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-kube-api-access-vpwz5" (OuterVolumeSpecName: "kube-api-access-vpwz5") pod "7f2f9b72-16af-43fb-9687-fe7cbbc51bb3" (UID: "7f2f9b72-16af-43fb-9687-fe7cbbc51bb3"). InnerVolumeSpecName "kube-api-access-vpwz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.923087 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n2x6s" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.934910 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.939563 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8xw98" Mar 14 09:02:35 crc kubenswrapper[4956]: I0314 09:02:35.976771 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f2f9b72-16af-43fb-9687-fe7cbbc51bb3" (UID: "7f2f9b72-16af-43fb-9687-fe7cbbc51bb3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.009903 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xsff\" (UniqueName: \"kubernetes.io/projected/2667e495-4a15-4aa2-8839-e1b66f2ee380-kube-api-access-2xsff\") pod \"2667e495-4a15-4aa2-8839-e1b66f2ee380\" (UID: \"2667e495-4a15-4aa2-8839-e1b66f2ee380\") " Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.009970 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b22cf8-2118-463f-804e-9890feee4427-catalog-content\") pod \"78b22cf8-2118-463f-804e-9890feee4427\" (UID: \"78b22cf8-2118-463f-804e-9890feee4427\") " Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.010025 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b22cf8-2118-463f-804e-9890feee4427-utilities\") pod \"78b22cf8-2118-463f-804e-9890feee4427\" (UID: \"78b22cf8-2118-463f-804e-9890feee4427\") " Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.010877 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2667e495-4a15-4aa2-8839-e1b66f2ee380-utilities\") pod \"2667e495-4a15-4aa2-8839-e1b66f2ee380\" (UID: \"2667e495-4a15-4aa2-8839-e1b66f2ee380\") " Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.010905 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2667e495-4a15-4aa2-8839-e1b66f2ee380-catalog-content\") pod \"2667e495-4a15-4aa2-8839-e1b66f2ee380\" (UID: \"2667e495-4a15-4aa2-8839-e1b66f2ee380\") " Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.010933 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7rkl\" 
(UniqueName: \"kubernetes.io/projected/78b22cf8-2118-463f-804e-9890feee4427-kube-api-access-r7rkl\") pod \"78b22cf8-2118-463f-804e-9890feee4427\" (UID: \"78b22cf8-2118-463f-804e-9890feee4427\") " Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.011174 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.011192 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpwz5\" (UniqueName: \"kubernetes.io/projected/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-kube-api-access-vpwz5\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.011203 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.011786 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2667e495-4a15-4aa2-8839-e1b66f2ee380-utilities" (OuterVolumeSpecName: "utilities") pod "2667e495-4a15-4aa2-8839-e1b66f2ee380" (UID: "2667e495-4a15-4aa2-8839-e1b66f2ee380"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.012788 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b22cf8-2118-463f-804e-9890feee4427-utilities" (OuterVolumeSpecName: "utilities") pod "78b22cf8-2118-463f-804e-9890feee4427" (UID: "78b22cf8-2118-463f-804e-9890feee4427"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.012929 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2667e495-4a15-4aa2-8839-e1b66f2ee380-kube-api-access-2xsff" (OuterVolumeSpecName: "kube-api-access-2xsff") pod "2667e495-4a15-4aa2-8839-e1b66f2ee380" (UID: "2667e495-4a15-4aa2-8839-e1b66f2ee380"). InnerVolumeSpecName "kube-api-access-2xsff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.013626 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78b22cf8-2118-463f-804e-9890feee4427-kube-api-access-r7rkl" (OuterVolumeSpecName: "kube-api-access-r7rkl") pod "78b22cf8-2118-463f-804e-9890feee4427" (UID: "78b22cf8-2118-463f-804e-9890feee4427"). InnerVolumeSpecName "kube-api-access-r7rkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.041727 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b22cf8-2118-463f-804e-9890feee4427-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78b22cf8-2118-463f-804e-9890feee4427" (UID: "78b22cf8-2118-463f-804e-9890feee4427"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.111939 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xxwh\" (UniqueName: \"kubernetes.io/projected/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-kube-api-access-9xxwh\") pod \"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e\" (UID: \"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e\") " Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.111990 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-marketplace-operator-metrics\") pod \"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e\" (UID: \"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e\") " Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.112025 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3b7192-2792-4295-b25d-a22c476cd174-utilities\") pod \"6a3b7192-2792-4295-b25d-a22c476cd174\" (UID: \"6a3b7192-2792-4295-b25d-a22c476cd174\") " Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.112079 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb4vj\" (UniqueName: \"kubernetes.io/projected/6a3b7192-2792-4295-b25d-a22c476cd174-kube-api-access-rb4vj\") pod \"6a3b7192-2792-4295-b25d-a22c476cd174\" (UID: \"6a3b7192-2792-4295-b25d-a22c476cd174\") " Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.112106 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-marketplace-trusted-ca\") pod \"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e\" (UID: \"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e\") " Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.112129 4956 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3b7192-2792-4295-b25d-a22c476cd174-catalog-content\") pod \"6a3b7192-2792-4295-b25d-a22c476cd174\" (UID: \"6a3b7192-2792-4295-b25d-a22c476cd174\") " Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.112339 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2667e495-4a15-4aa2-8839-e1b66f2ee380-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.112350 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7rkl\" (UniqueName: \"kubernetes.io/projected/78b22cf8-2118-463f-804e-9890feee4427-kube-api-access-r7rkl\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.112360 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xsff\" (UniqueName: \"kubernetes.io/projected/2667e495-4a15-4aa2-8839-e1b66f2ee380-kube-api-access-2xsff\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.112368 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b22cf8-2118-463f-804e-9890feee4427-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.112376 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b22cf8-2118-463f-804e-9890feee4427-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.113584 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "0eb00c56-ab05-4c9d-a1d4-a80a53775d4e" (UID: "0eb00c56-ab05-4c9d-a1d4-a80a53775d4e"). 
InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.115394 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3b7192-2792-4295-b25d-a22c476cd174-utilities" (OuterVolumeSpecName: "utilities") pod "6a3b7192-2792-4295-b25d-a22c476cd174" (UID: "6a3b7192-2792-4295-b25d-a22c476cd174"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.116327 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3b7192-2792-4295-b25d-a22c476cd174-kube-api-access-rb4vj" (OuterVolumeSpecName: "kube-api-access-rb4vj") pod "6a3b7192-2792-4295-b25d-a22c476cd174" (UID: "6a3b7192-2792-4295-b25d-a22c476cd174"). InnerVolumeSpecName "kube-api-access-rb4vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.116387 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "0eb00c56-ab05-4c9d-a1d4-a80a53775d4e" (UID: "0eb00c56-ab05-4c9d-a1d4-a80a53775d4e"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.118032 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-kube-api-access-9xxwh" (OuterVolumeSpecName: "kube-api-access-9xxwh") pod "0eb00c56-ab05-4c9d-a1d4-a80a53775d4e" (UID: "0eb00c56-ab05-4c9d-a1d4-a80a53775d4e"). InnerVolumeSpecName "kube-api-access-9xxwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.140720 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2667e495-4a15-4aa2-8839-e1b66f2ee380-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2667e495-4a15-4aa2-8839-e1b66f2ee380" (UID: "2667e495-4a15-4aa2-8839-e1b66f2ee380"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.157623 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3b7192-2792-4295-b25d-a22c476cd174-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a3b7192-2792-4295-b25d-a22c476cd174" (UID: "6a3b7192-2792-4295-b25d-a22c476cd174"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.197299 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7gnpp"] Mar 14 09:02:36 crc kubenswrapper[4956]: W0314 09:02:36.202367 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92ce0359_87bc_46d6_8673_b10febbf0742.slice/crio-9a0d591138a86b293197f91a3b541b4d1e5e88f7477b3e2d18213f86e44bca6c WatchSource:0}: Error finding container 9a0d591138a86b293197f91a3b541b4d1e5e88f7477b3e2d18213f86e44bca6c: Status 404 returned error can't find the container with id 9a0d591138a86b293197f91a3b541b4d1e5e88f7477b3e2d18213f86e44bca6c Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.213865 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2667e495-4a15-4aa2-8839-e1b66f2ee380-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 
09:02:36.213886 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xxwh\" (UniqueName: \"kubernetes.io/projected/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-kube-api-access-9xxwh\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.213898 4956 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.213909 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3b7192-2792-4295-b25d-a22c476cd174-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.213918 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb4vj\" (UniqueName: \"kubernetes.io/projected/6a3b7192-2792-4295-b25d-a22c476cd174-kube-api-access-rb4vj\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.213926 4956 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.213933 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3b7192-2792-4295-b25d-a22c476cd174-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.753290 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7gnpp" event={"ID":"92ce0359-87bc-46d6-8673-b10febbf0742","Type":"ContainerStarted","Data":"ce55d813b45a066f7d8a4fcfe0b60c6e20b81924e9f727ff4c44748f0963eca2"} Mar 14 09:02:36 crc kubenswrapper[4956]: 
I0314 09:02:36.753401 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7gnpp" event={"ID":"92ce0359-87bc-46d6-8673-b10febbf0742","Type":"ContainerStarted","Data":"9a0d591138a86b293197f91a3b541b4d1e5e88f7477b3e2d18213f86e44bca6c"} Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.753468 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7gnpp" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.756405 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2vtm" event={"ID":"7f2f9b72-16af-43fb-9687-fe7cbbc51bb3","Type":"ContainerDied","Data":"935c6f59c403bcddfc63f94f3761e3deaa2ef9567721c1164f88c2a1330e17f7"} Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.756449 4956 scope.go:117] "RemoveContainer" containerID="c9fa095b85996840c323172eeb91450b73eb689debc81d59a28df6655744e781" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.756553 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j2vtm" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.757955 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7gnpp" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.760678 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2x6s" event={"ID":"2667e495-4a15-4aa2-8839-e1b66f2ee380","Type":"ContainerDied","Data":"0a4691427777bb8e20835891c990ab11cc0dd4b2e6d16be845235f4c0c0ff692"} Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.760773 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n2x6s" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.765992 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4ztd" event={"ID":"78b22cf8-2118-463f-804e-9890feee4427","Type":"ContainerDied","Data":"a512f833a9ece83238ad251f8b12089f58b800a74b9d08f510d3c3d8c6a1d4c5"} Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.766133 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4ztd" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.768059 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" event={"ID":"0eb00c56-ab05-4c9d-a1d4-a80a53775d4e","Type":"ContainerDied","Data":"72a93aa6750359f1b00e770e04c43c8e02976f6c78aec2db408f30f0a67db7a4"} Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.768154 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2hj4h" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.771383 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xw98" event={"ID":"6a3b7192-2792-4295-b25d-a22c476cd174","Type":"ContainerDied","Data":"5364f0d2a0b56f993c67694feeea43c64ad9de9b6ae0347dd10211fad9586d54"} Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.771498 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8xw98" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.790708 4956 scope.go:117] "RemoveContainer" containerID="2cb746ea276a53ffe403d6c8584658e2c4ea68a0ec683328239e6a79cfd6863b" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.799143 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7gnpp" podStartSLOduration=1.799121651 podStartE2EDuration="1.799121651s" podCreationTimestamp="2026-03-14 09:02:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:02:36.781711546 +0000 UTC m=+362.294403814" watchObservedRunningTime="2026-03-14 09:02:36.799121651 +0000 UTC m=+362.311813939" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.799701 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j2vtm"] Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.804039 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j2vtm"] Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.816127 4956 scope.go:117] "RemoveContainer" containerID="534dc332cf3213c3378b40a337a3556a617a27764edd854462063130eba499ca" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.819704 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n2x6s"] Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.823879 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n2x6s"] Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.853711 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8xw98"] Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.861173 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-8xw98"] Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.868630 4956 scope.go:117] "RemoveContainer" containerID="e48dd7440dc630c42703dac0e9aa62de2970a6e1a7275f3cfffd29dc2410bd47" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.872908 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2hj4h"] Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.878210 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2hj4h"] Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.888864 4956 scope.go:117] "RemoveContainer" containerID="57ebc0805c02a5b7a5111cbc5b97b015673bdac005be912f580df0488df2cc63" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.891297 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4ztd"] Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.894496 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4ztd"] Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.909715 4956 scope.go:117] "RemoveContainer" containerID="674ae2c1339c7b26e1e8d9d05f922112191c27a8b7533e53848f89b2e2196c94" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.924368 4956 scope.go:117] "RemoveContainer" containerID="08115a7096065a8cb729b565a5b2929affd72c301c045bd48274eac6230d07a8" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.940508 4956 scope.go:117] "RemoveContainer" containerID="521651c5c3411b55a2394f29664b79a71bbb893c6cc9d66cb2240c0cb72df7cc" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.953201 4956 scope.go:117] "RemoveContainer" containerID="f5cff739814123d5bbda760d0816ca4510d4a8554ce2d646eec4c24af9f8a933" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.963934 4956 scope.go:117] "RemoveContainer" 
containerID="e8ef7959509dcd8e16bf0fbeafc42f319f009434e5944d4e5b80dca77cee3fce" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.976032 4956 scope.go:117] "RemoveContainer" containerID="2022c86d8eb04405509c2f11141a901012b463aba8b34a9e4e8365477e8f6112" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.986155 4956 scope.go:117] "RemoveContainer" containerID="74cab11720b09ee8e6e9cf5424bd24f1cc1ba5a3fd23a7edd990291aa8a9372b" Mar 14 09:02:36 crc kubenswrapper[4956]: I0314 09:02:36.996639 4956 scope.go:117] "RemoveContainer" containerID="49dc3da3d6778d72cad65f189f9bd13ac251fc0f5f341eab57b33345fe2afd93" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.215283 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eb00c56-ab05-4c9d-a1d4-a80a53775d4e" path="/var/lib/kubelet/pods/0eb00c56-ab05-4c9d-a1d4-a80a53775d4e/volumes" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.215903 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2667e495-4a15-4aa2-8839-e1b66f2ee380" path="/var/lib/kubelet/pods/2667e495-4a15-4aa2-8839-e1b66f2ee380/volumes" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.216579 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a3b7192-2792-4295-b25d-a22c476cd174" path="/var/lib/kubelet/pods/6a3b7192-2792-4295-b25d-a22c476cd174/volumes" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.217547 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78b22cf8-2118-463f-804e-9890feee4427" path="/var/lib/kubelet/pods/78b22cf8-2118-463f-804e-9890feee4427/volumes" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.218095 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f2f9b72-16af-43fb-9687-fe7cbbc51bb3" path="/var/lib/kubelet/pods/7f2f9b72-16af-43fb-9687-fe7cbbc51bb3/volumes" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547294 4956 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-6mtml"] Mar 14 09:02:37 crc kubenswrapper[4956]: E0314 09:02:37.547539 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3b7192-2792-4295-b25d-a22c476cd174" containerName="extract-content" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547555 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3b7192-2792-4295-b25d-a22c476cd174" containerName="extract-content" Mar 14 09:02:37 crc kubenswrapper[4956]: E0314 09:02:37.547572 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f2f9b72-16af-43fb-9687-fe7cbbc51bb3" containerName="extract-utilities" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547579 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2f9b72-16af-43fb-9687-fe7cbbc51bb3" containerName="extract-utilities" Mar 14 09:02:37 crc kubenswrapper[4956]: E0314 09:02:37.547591 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3b7192-2792-4295-b25d-a22c476cd174" containerName="extract-utilities" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547599 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3b7192-2792-4295-b25d-a22c476cd174" containerName="extract-utilities" Mar 14 09:02:37 crc kubenswrapper[4956]: E0314 09:02:37.547610 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3b7192-2792-4295-b25d-a22c476cd174" containerName="registry-server" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547618 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3b7192-2792-4295-b25d-a22c476cd174" containerName="registry-server" Mar 14 09:02:37 crc kubenswrapper[4956]: E0314 09:02:37.547628 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2667e495-4a15-4aa2-8839-e1b66f2ee380" containerName="extract-utilities" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547635 4956 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2667e495-4a15-4aa2-8839-e1b66f2ee380" containerName="extract-utilities" Mar 14 09:02:37 crc kubenswrapper[4956]: E0314 09:02:37.547645 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b22cf8-2118-463f-804e-9890feee4427" containerName="registry-server" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547653 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b22cf8-2118-463f-804e-9890feee4427" containerName="registry-server" Mar 14 09:02:37 crc kubenswrapper[4956]: E0314 09:02:37.547667 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b22cf8-2118-463f-804e-9890feee4427" containerName="extract-content" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547677 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b22cf8-2118-463f-804e-9890feee4427" containerName="extract-content" Mar 14 09:02:37 crc kubenswrapper[4956]: E0314 09:02:37.547686 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b22cf8-2118-463f-804e-9890feee4427" containerName="extract-utilities" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547694 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b22cf8-2118-463f-804e-9890feee4427" containerName="extract-utilities" Mar 14 09:02:37 crc kubenswrapper[4956]: E0314 09:02:37.547704 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f2f9b72-16af-43fb-9687-fe7cbbc51bb3" containerName="extract-content" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547711 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2f9b72-16af-43fb-9687-fe7cbbc51bb3" containerName="extract-content" Mar 14 09:02:37 crc kubenswrapper[4956]: E0314 09:02:37.547722 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2667e495-4a15-4aa2-8839-e1b66f2ee380" containerName="registry-server" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547729 4956 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2667e495-4a15-4aa2-8839-e1b66f2ee380" containerName="registry-server" Mar 14 09:02:37 crc kubenswrapper[4956]: E0314 09:02:37.547742 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f2f9b72-16af-43fb-9687-fe7cbbc51bb3" containerName="registry-server" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547749 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2f9b72-16af-43fb-9687-fe7cbbc51bb3" containerName="registry-server" Mar 14 09:02:37 crc kubenswrapper[4956]: E0314 09:02:37.547758 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2667e495-4a15-4aa2-8839-e1b66f2ee380" containerName="extract-content" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547765 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="2667e495-4a15-4aa2-8839-e1b66f2ee380" containerName="extract-content" Mar 14 09:02:37 crc kubenswrapper[4956]: E0314 09:02:37.547774 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb00c56-ab05-4c9d-a1d4-a80a53775d4e" containerName="marketplace-operator" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547781 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eb00c56-ab05-4c9d-a1d4-a80a53775d4e" containerName="marketplace-operator" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547886 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b22cf8-2118-463f-804e-9890feee4427" containerName="registry-server" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547898 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eb00c56-ab05-4c9d-a1d4-a80a53775d4e" containerName="marketplace-operator" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547913 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f2f9b72-16af-43fb-9687-fe7cbbc51bb3" containerName="registry-server" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547923 4956 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="2667e495-4a15-4aa2-8839-e1b66f2ee380" containerName="registry-server" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.547932 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a3b7192-2792-4295-b25d-a22c476cd174" containerName="registry-server" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.548795 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6mtml" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.550795 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.554035 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6mtml"] Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.635382 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/098e20c5-7d99-4468-896d-fb13222a450b-utilities\") pod \"redhat-marketplace-6mtml\" (UID: \"098e20c5-7d99-4468-896d-fb13222a450b\") " pod="openshift-marketplace/redhat-marketplace-6mtml" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.635454 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q29vk\" (UniqueName: \"kubernetes.io/projected/098e20c5-7d99-4468-896d-fb13222a450b-kube-api-access-q29vk\") pod \"redhat-marketplace-6mtml\" (UID: \"098e20c5-7d99-4468-896d-fb13222a450b\") " pod="openshift-marketplace/redhat-marketplace-6mtml" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.635616 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/098e20c5-7d99-4468-896d-fb13222a450b-catalog-content\") pod 
\"redhat-marketplace-6mtml\" (UID: \"098e20c5-7d99-4468-896d-fb13222a450b\") " pod="openshift-marketplace/redhat-marketplace-6mtml" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.736752 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/098e20c5-7d99-4468-896d-fb13222a450b-utilities\") pod \"redhat-marketplace-6mtml\" (UID: \"098e20c5-7d99-4468-896d-fb13222a450b\") " pod="openshift-marketplace/redhat-marketplace-6mtml" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.736829 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q29vk\" (UniqueName: \"kubernetes.io/projected/098e20c5-7d99-4468-896d-fb13222a450b-kube-api-access-q29vk\") pod \"redhat-marketplace-6mtml\" (UID: \"098e20c5-7d99-4468-896d-fb13222a450b\") " pod="openshift-marketplace/redhat-marketplace-6mtml" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.736866 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/098e20c5-7d99-4468-896d-fb13222a450b-catalog-content\") pod \"redhat-marketplace-6mtml\" (UID: \"098e20c5-7d99-4468-896d-fb13222a450b\") " pod="openshift-marketplace/redhat-marketplace-6mtml" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.737252 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/098e20c5-7d99-4468-896d-fb13222a450b-utilities\") pod \"redhat-marketplace-6mtml\" (UID: \"098e20c5-7d99-4468-896d-fb13222a450b\") " pod="openshift-marketplace/redhat-marketplace-6mtml" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.737276 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/098e20c5-7d99-4468-896d-fb13222a450b-catalog-content\") pod \"redhat-marketplace-6mtml\" (UID: 
\"098e20c5-7d99-4468-896d-fb13222a450b\") " pod="openshift-marketplace/redhat-marketplace-6mtml" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.752028 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5d92s"] Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.752987 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5d92s" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.755571 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.759174 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q29vk\" (UniqueName: \"kubernetes.io/projected/098e20c5-7d99-4468-896d-fb13222a450b-kube-api-access-q29vk\") pod \"redhat-marketplace-6mtml\" (UID: \"098e20c5-7d99-4468-896d-fb13222a450b\") " pod="openshift-marketplace/redhat-marketplace-6mtml" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.766170 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5d92s"] Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.838387 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf7s9\" (UniqueName: \"kubernetes.io/projected/e7f24159-2581-4653-be69-3aae1dd7e3f9-kube-api-access-hf7s9\") pod \"redhat-operators-5d92s\" (UID: \"e7f24159-2581-4653-be69-3aae1dd7e3f9\") " pod="openshift-marketplace/redhat-operators-5d92s" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.838465 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7f24159-2581-4653-be69-3aae1dd7e3f9-utilities\") pod \"redhat-operators-5d92s\" (UID: \"e7f24159-2581-4653-be69-3aae1dd7e3f9\") " 
pod="openshift-marketplace/redhat-operators-5d92s" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.838514 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7f24159-2581-4653-be69-3aae1dd7e3f9-catalog-content\") pod \"redhat-operators-5d92s\" (UID: \"e7f24159-2581-4653-be69-3aae1dd7e3f9\") " pod="openshift-marketplace/redhat-operators-5d92s" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.867432 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6mtml" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.939126 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf7s9\" (UniqueName: \"kubernetes.io/projected/e7f24159-2581-4653-be69-3aae1dd7e3f9-kube-api-access-hf7s9\") pod \"redhat-operators-5d92s\" (UID: \"e7f24159-2581-4653-be69-3aae1dd7e3f9\") " pod="openshift-marketplace/redhat-operators-5d92s" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.939303 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7f24159-2581-4653-be69-3aae1dd7e3f9-utilities\") pod \"redhat-operators-5d92s\" (UID: \"e7f24159-2581-4653-be69-3aae1dd7e3f9\") " pod="openshift-marketplace/redhat-operators-5d92s" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.939342 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7f24159-2581-4653-be69-3aae1dd7e3f9-catalog-content\") pod \"redhat-operators-5d92s\" (UID: \"e7f24159-2581-4653-be69-3aae1dd7e3f9\") " pod="openshift-marketplace/redhat-operators-5d92s" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.939828 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e7f24159-2581-4653-be69-3aae1dd7e3f9-catalog-content\") pod \"redhat-operators-5d92s\" (UID: \"e7f24159-2581-4653-be69-3aae1dd7e3f9\") " pod="openshift-marketplace/redhat-operators-5d92s" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.940523 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7f24159-2581-4653-be69-3aae1dd7e3f9-utilities\") pod \"redhat-operators-5d92s\" (UID: \"e7f24159-2581-4653-be69-3aae1dd7e3f9\") " pod="openshift-marketplace/redhat-operators-5d92s" Mar 14 09:02:37 crc kubenswrapper[4956]: I0314 09:02:37.960145 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf7s9\" (UniqueName: \"kubernetes.io/projected/e7f24159-2581-4653-be69-3aae1dd7e3f9-kube-api-access-hf7s9\") pod \"redhat-operators-5d92s\" (UID: \"e7f24159-2581-4653-be69-3aae1dd7e3f9\") " pod="openshift-marketplace/redhat-operators-5d92s" Mar 14 09:02:38 crc kubenswrapper[4956]: I0314 09:02:38.095208 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5d92s" Mar 14 09:02:38 crc kubenswrapper[4956]: I0314 09:02:38.513295 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6mtml"] Mar 14 09:02:38 crc kubenswrapper[4956]: I0314 09:02:38.809504 4956 generic.go:334] "Generic (PLEG): container finished" podID="098e20c5-7d99-4468-896d-fb13222a450b" containerID="1e226276734455b5c334e2c955374230bcc3dbef0c66d23a11dc9ac16b271ae7" exitCode=0 Mar 14 09:02:38 crc kubenswrapper[4956]: I0314 09:02:38.809705 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mtml" event={"ID":"098e20c5-7d99-4468-896d-fb13222a450b","Type":"ContainerDied","Data":"1e226276734455b5c334e2c955374230bcc3dbef0c66d23a11dc9ac16b271ae7"} Mar 14 09:02:38 crc kubenswrapper[4956]: I0314 09:02:38.809776 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mtml" event={"ID":"098e20c5-7d99-4468-896d-fb13222a450b","Type":"ContainerStarted","Data":"c6ccb81d0d5c52affa4fd2f67a1e0e9ba778a26a4b6a396ee26bad875801fbd7"} Mar 14 09:02:38 crc kubenswrapper[4956]: I0314 09:02:38.824631 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5d92s"] Mar 14 09:02:39 crc kubenswrapper[4956]: I0314 09:02:39.815736 4956 generic.go:334] "Generic (PLEG): container finished" podID="e7f24159-2581-4653-be69-3aae1dd7e3f9" containerID="faf458c0fcb89e8c88f4c49d6e838391080e07102560202dc408a99fe7170ced" exitCode=0 Mar 14 09:02:39 crc kubenswrapper[4956]: I0314 09:02:39.815853 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5d92s" event={"ID":"e7f24159-2581-4653-be69-3aae1dd7e3f9","Type":"ContainerDied","Data":"faf458c0fcb89e8c88f4c49d6e838391080e07102560202dc408a99fe7170ced"} Mar 14 09:02:39 crc kubenswrapper[4956]: I0314 09:02:39.816020 4956 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-5d92s" event={"ID":"e7f24159-2581-4653-be69-3aae1dd7e3f9","Type":"ContainerStarted","Data":"c3f1cc0e028184439e7709fb9fc5b4b7f1bbc93407e100374f9116bad2565740"} Mar 14 09:02:39 crc kubenswrapper[4956]: I0314 09:02:39.952796 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vzmj4"] Mar 14 09:02:39 crc kubenswrapper[4956]: I0314 09:02:39.953867 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzmj4" Mar 14 09:02:39 crc kubenswrapper[4956]: I0314 09:02:39.955874 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 14 09:02:39 crc kubenswrapper[4956]: I0314 09:02:39.959278 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzmj4"] Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.069918 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dtn4\" (UniqueName: \"kubernetes.io/projected/7e5025eb-1d48-4be9-ab21-5aa5c0436d7b-kube-api-access-5dtn4\") pod \"certified-operators-vzmj4\" (UID: \"7e5025eb-1d48-4be9-ab21-5aa5c0436d7b\") " pod="openshift-marketplace/certified-operators-vzmj4" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.069960 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e5025eb-1d48-4be9-ab21-5aa5c0436d7b-utilities\") pod \"certified-operators-vzmj4\" (UID: \"7e5025eb-1d48-4be9-ab21-5aa5c0436d7b\") " pod="openshift-marketplace/certified-operators-vzmj4" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.069983 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7e5025eb-1d48-4be9-ab21-5aa5c0436d7b-catalog-content\") pod \"certified-operators-vzmj4\" (UID: \"7e5025eb-1d48-4be9-ab21-5aa5c0436d7b\") " pod="openshift-marketplace/certified-operators-vzmj4" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.147696 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fnwsk"] Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.149092 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fnwsk" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.152800 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.156963 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fnwsk"] Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.171093 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dtn4\" (UniqueName: \"kubernetes.io/projected/7e5025eb-1d48-4be9-ab21-5aa5c0436d7b-kube-api-access-5dtn4\") pod \"certified-operators-vzmj4\" (UID: \"7e5025eb-1d48-4be9-ab21-5aa5c0436d7b\") " pod="openshift-marketplace/certified-operators-vzmj4" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.171138 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e5025eb-1d48-4be9-ab21-5aa5c0436d7b-utilities\") pod \"certified-operators-vzmj4\" (UID: \"7e5025eb-1d48-4be9-ab21-5aa5c0436d7b\") " pod="openshift-marketplace/certified-operators-vzmj4" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.171163 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7e5025eb-1d48-4be9-ab21-5aa5c0436d7b-catalog-content\") pod \"certified-operators-vzmj4\" (UID: \"7e5025eb-1d48-4be9-ab21-5aa5c0436d7b\") " pod="openshift-marketplace/certified-operators-vzmj4" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.171597 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e5025eb-1d48-4be9-ab21-5aa5c0436d7b-utilities\") pod \"certified-operators-vzmj4\" (UID: \"7e5025eb-1d48-4be9-ab21-5aa5c0436d7b\") " pod="openshift-marketplace/certified-operators-vzmj4" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.171628 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e5025eb-1d48-4be9-ab21-5aa5c0436d7b-catalog-content\") pod \"certified-operators-vzmj4\" (UID: \"7e5025eb-1d48-4be9-ab21-5aa5c0436d7b\") " pod="openshift-marketplace/certified-operators-vzmj4" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.195135 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dtn4\" (UniqueName: \"kubernetes.io/projected/7e5025eb-1d48-4be9-ab21-5aa5c0436d7b-kube-api-access-5dtn4\") pod \"certified-operators-vzmj4\" (UID: \"7e5025eb-1d48-4be9-ab21-5aa5c0436d7b\") " pod="openshift-marketplace/certified-operators-vzmj4" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.271662 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzmj4" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.271932 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eb1f1fb-17bc-4cc7-a158-6576b474e996-utilities\") pod \"community-operators-fnwsk\" (UID: \"0eb1f1fb-17bc-4cc7-a158-6576b474e996\") " pod="openshift-marketplace/community-operators-fnwsk" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.271981 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7c27\" (UniqueName: \"kubernetes.io/projected/0eb1f1fb-17bc-4cc7-a158-6576b474e996-kube-api-access-w7c27\") pod \"community-operators-fnwsk\" (UID: \"0eb1f1fb-17bc-4cc7-a158-6576b474e996\") " pod="openshift-marketplace/community-operators-fnwsk" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.272032 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eb1f1fb-17bc-4cc7-a158-6576b474e996-catalog-content\") pod \"community-operators-fnwsk\" (UID: \"0eb1f1fb-17bc-4cc7-a158-6576b474e996\") " pod="openshift-marketplace/community-operators-fnwsk" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.373514 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eb1f1fb-17bc-4cc7-a158-6576b474e996-utilities\") pod \"community-operators-fnwsk\" (UID: \"0eb1f1fb-17bc-4cc7-a158-6576b474e996\") " pod="openshift-marketplace/community-operators-fnwsk" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.373898 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7c27\" (UniqueName: \"kubernetes.io/projected/0eb1f1fb-17bc-4cc7-a158-6576b474e996-kube-api-access-w7c27\") pod 
\"community-operators-fnwsk\" (UID: \"0eb1f1fb-17bc-4cc7-a158-6576b474e996\") " pod="openshift-marketplace/community-operators-fnwsk" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.373946 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eb1f1fb-17bc-4cc7-a158-6576b474e996-catalog-content\") pod \"community-operators-fnwsk\" (UID: \"0eb1f1fb-17bc-4cc7-a158-6576b474e996\") " pod="openshift-marketplace/community-operators-fnwsk" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.374290 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eb1f1fb-17bc-4cc7-a158-6576b474e996-catalog-content\") pod \"community-operators-fnwsk\" (UID: \"0eb1f1fb-17bc-4cc7-a158-6576b474e996\") " pod="openshift-marketplace/community-operators-fnwsk" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.374299 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eb1f1fb-17bc-4cc7-a158-6576b474e996-utilities\") pod \"community-operators-fnwsk\" (UID: \"0eb1f1fb-17bc-4cc7-a158-6576b474e996\") " pod="openshift-marketplace/community-operators-fnwsk" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.411205 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7c27\" (UniqueName: \"kubernetes.io/projected/0eb1f1fb-17bc-4cc7-a158-6576b474e996-kube-api-access-w7c27\") pod \"community-operators-fnwsk\" (UID: \"0eb1f1fb-17bc-4cc7-a158-6576b474e996\") " pod="openshift-marketplace/community-operators-fnwsk" Mar 14 09:02:40 crc kubenswrapper[4956]: I0314 09:02:40.493055 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fnwsk" Mar 14 09:02:41 crc kubenswrapper[4956]: I0314 09:02:40.710608 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzmj4"] Mar 14 09:02:41 crc kubenswrapper[4956]: I0314 09:02:40.825200 4956 generic.go:334] "Generic (PLEG): container finished" podID="098e20c5-7d99-4468-896d-fb13222a450b" containerID="df179f99128f22d883a660a70e281ae19c5113de9eee0881107d1db4e3e0868d" exitCode=0 Mar 14 09:02:41 crc kubenswrapper[4956]: I0314 09:02:40.825298 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mtml" event={"ID":"098e20c5-7d99-4468-896d-fb13222a450b","Type":"ContainerDied","Data":"df179f99128f22d883a660a70e281ae19c5113de9eee0881107d1db4e3e0868d"} Mar 14 09:02:41 crc kubenswrapper[4956]: I0314 09:02:40.828815 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzmj4" event={"ID":"7e5025eb-1d48-4be9-ab21-5aa5c0436d7b","Type":"ContainerStarted","Data":"ccb9e7cf30221722d7799f5d2137fec7754d5cd8610eff1a634aca2cc6eae1f1"} Mar 14 09:02:41 crc kubenswrapper[4956]: I0314 09:02:41.836431 4956 generic.go:334] "Generic (PLEG): container finished" podID="7e5025eb-1d48-4be9-ab21-5aa5c0436d7b" containerID="c4a29ef7b98bf6ab27851be220d62f511e0b9eb50eedeb1c676a459ee9584630" exitCode=0 Mar 14 09:02:41 crc kubenswrapper[4956]: I0314 09:02:41.836532 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzmj4" event={"ID":"7e5025eb-1d48-4be9-ab21-5aa5c0436d7b","Type":"ContainerDied","Data":"c4a29ef7b98bf6ab27851be220d62f511e0b9eb50eedeb1c676a459ee9584630"} Mar 14 09:02:41 crc kubenswrapper[4956]: I0314 09:02:41.842844 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5d92s" 
event={"ID":"e7f24159-2581-4653-be69-3aae1dd7e3f9","Type":"ContainerStarted","Data":"d18c25e3c6ec210650af150d9e5aae9e5c9cb24fc50450e5965493f1261cfafb"} Mar 14 09:02:41 crc kubenswrapper[4956]: I0314 09:02:41.848769 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mtml" event={"ID":"098e20c5-7d99-4468-896d-fb13222a450b","Type":"ContainerStarted","Data":"d8587b87356dbd966af385451843f0ad075f522187dbecba70f612bb97208dbb"} Mar 14 09:02:41 crc kubenswrapper[4956]: I0314 09:02:41.879944 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6mtml" podStartSLOduration=2.186903831 podStartE2EDuration="4.879921844s" podCreationTimestamp="2026-03-14 09:02:37 +0000 UTC" firstStartedPulling="2026-03-14 09:02:38.81319181 +0000 UTC m=+364.325884078" lastFinishedPulling="2026-03-14 09:02:41.506209823 +0000 UTC m=+367.018902091" observedRunningTime="2026-03-14 09:02:41.875985451 +0000 UTC m=+367.388677749" watchObservedRunningTime="2026-03-14 09:02:41.879921844 +0000 UTC m=+367.392614132" Mar 14 09:02:42 crc kubenswrapper[4956]: I0314 09:02:42.088304 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fnwsk"] Mar 14 09:02:42 crc kubenswrapper[4956]: W0314 09:02:42.143100 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eb1f1fb_17bc_4cc7_a158_6576b474e996.slice/crio-9b0844347634b292bab6deab780a00cff47894d62d10be6148a073a7b0f92392 WatchSource:0}: Error finding container 9b0844347634b292bab6deab780a00cff47894d62d10be6148a073a7b0f92392: Status 404 returned error can't find the container with id 9b0844347634b292bab6deab780a00cff47894d62d10be6148a073a7b0f92392 Mar 14 09:02:42 crc kubenswrapper[4956]: I0314 09:02:42.854248 4956 generic.go:334] "Generic (PLEG): container finished" podID="0eb1f1fb-17bc-4cc7-a158-6576b474e996" 
containerID="3b910765cf65b080445cf9eaa20a3edd142c84b0b68defff89b48d7dfe306bd1" exitCode=0 Mar 14 09:02:42 crc kubenswrapper[4956]: I0314 09:02:42.854920 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnwsk" event={"ID":"0eb1f1fb-17bc-4cc7-a158-6576b474e996","Type":"ContainerDied","Data":"3b910765cf65b080445cf9eaa20a3edd142c84b0b68defff89b48d7dfe306bd1"} Mar 14 09:02:42 crc kubenswrapper[4956]: I0314 09:02:42.854941 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnwsk" event={"ID":"0eb1f1fb-17bc-4cc7-a158-6576b474e996","Type":"ContainerStarted","Data":"9b0844347634b292bab6deab780a00cff47894d62d10be6148a073a7b0f92392"} Mar 14 09:02:42 crc kubenswrapper[4956]: I0314 09:02:42.856991 4956 generic.go:334] "Generic (PLEG): container finished" podID="e7f24159-2581-4653-be69-3aae1dd7e3f9" containerID="d18c25e3c6ec210650af150d9e5aae9e5c9cb24fc50450e5965493f1261cfafb" exitCode=0 Mar 14 09:02:42 crc kubenswrapper[4956]: I0314 09:02:42.857794 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5d92s" event={"ID":"e7f24159-2581-4653-be69-3aae1dd7e3f9","Type":"ContainerDied","Data":"d18c25e3c6ec210650af150d9e5aae9e5c9cb24fc50450e5965493f1261cfafb"} Mar 14 09:02:43 crc kubenswrapper[4956]: I0314 09:02:43.864527 4956 generic.go:334] "Generic (PLEG): container finished" podID="7e5025eb-1d48-4be9-ab21-5aa5c0436d7b" containerID="d9072a497d8e381275b9433d84c3d11f32eaa2b7b0cbad235dd7d14143a8f324" exitCode=0 Mar 14 09:02:43 crc kubenswrapper[4956]: I0314 09:02:43.864582 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzmj4" event={"ID":"7e5025eb-1d48-4be9-ab21-5aa5c0436d7b","Type":"ContainerDied","Data":"d9072a497d8e381275b9433d84c3d11f32eaa2b7b0cbad235dd7d14143a8f324"} Mar 14 09:02:43 crc kubenswrapper[4956]: I0314 09:02:43.869525 4956 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-5d92s" event={"ID":"e7f24159-2581-4653-be69-3aae1dd7e3f9","Type":"ContainerStarted","Data":"c91864cbf7f45e1a8afc887b60552e7c5389e811a00597aa07ed82737fe8e53f"} Mar 14 09:02:44 crc kubenswrapper[4956]: I0314 09:02:44.879616 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzmj4" event={"ID":"7e5025eb-1d48-4be9-ab21-5aa5c0436d7b","Type":"ContainerStarted","Data":"eb3f9feae56419206e9a33cd93abd52c8a72e3ee91c4a1dc531037b8c27ea8ee"} Mar 14 09:02:44 crc kubenswrapper[4956]: I0314 09:02:44.881152 4956 generic.go:334] "Generic (PLEG): container finished" podID="0eb1f1fb-17bc-4cc7-a158-6576b474e996" containerID="e8a71351f8546779f4a7354b89d7199ea701e7be002c1e720cf2bebc8cdf22f0" exitCode=0 Mar 14 09:02:44 crc kubenswrapper[4956]: I0314 09:02:44.881202 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnwsk" event={"ID":"0eb1f1fb-17bc-4cc7-a158-6576b474e996","Type":"ContainerDied","Data":"e8a71351f8546779f4a7354b89d7199ea701e7be002c1e720cf2bebc8cdf22f0"} Mar 14 09:02:44 crc kubenswrapper[4956]: I0314 09:02:44.906280 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vzmj4" podStartSLOduration=3.438046627 podStartE2EDuration="5.906260315s" podCreationTimestamp="2026-03-14 09:02:39 +0000 UTC" firstStartedPulling="2026-03-14 09:02:41.838395581 +0000 UTC m=+367.351087849" lastFinishedPulling="2026-03-14 09:02:44.306609269 +0000 UTC m=+369.819301537" observedRunningTime="2026-03-14 09:02:44.904794286 +0000 UTC m=+370.417486574" watchObservedRunningTime="2026-03-14 09:02:44.906260315 +0000 UTC m=+370.418952603" Mar 14 09:02:44 crc kubenswrapper[4956]: I0314 09:02:44.927679 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5d92s" podStartSLOduration=4.183457403 podStartE2EDuration="7.927663023s" 
podCreationTimestamp="2026-03-14 09:02:37 +0000 UTC" firstStartedPulling="2026-03-14 09:02:39.817162055 +0000 UTC m=+365.329854343" lastFinishedPulling="2026-03-14 09:02:43.561367695 +0000 UTC m=+369.074059963" observedRunningTime="2026-03-14 09:02:44.926987395 +0000 UTC m=+370.439679683" watchObservedRunningTime="2026-03-14 09:02:44.927663023 +0000 UTC m=+370.440355291" Mar 14 09:02:45 crc kubenswrapper[4956]: I0314 09:02:45.889650 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnwsk" event={"ID":"0eb1f1fb-17bc-4cc7-a158-6576b474e996","Type":"ContainerStarted","Data":"491c436fe23157a16fbbde3bc26217fe9c9c090fe92c639f30a4868be14d341a"} Mar 14 09:02:47 crc kubenswrapper[4956]: I0314 09:02:47.868647 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6mtml" Mar 14 09:02:47 crc kubenswrapper[4956]: I0314 09:02:47.868767 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6mtml" Mar 14 09:02:47 crc kubenswrapper[4956]: I0314 09:02:47.920070 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6mtml" Mar 14 09:02:47 crc kubenswrapper[4956]: I0314 09:02:47.936563 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fnwsk" podStartSLOduration=5.428398348 podStartE2EDuration="7.936541148s" podCreationTimestamp="2026-03-14 09:02:40 +0000 UTC" firstStartedPulling="2026-03-14 09:02:42.856056903 +0000 UTC m=+368.368749161" lastFinishedPulling="2026-03-14 09:02:45.364199683 +0000 UTC m=+370.876891961" observedRunningTime="2026-03-14 09:02:45.909145691 +0000 UTC m=+371.421837969" watchObservedRunningTime="2026-03-14 09:02:47.936541148 +0000 UTC m=+373.449233426" Mar 14 09:02:47 crc kubenswrapper[4956]: I0314 09:02:47.964183 4956 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6mtml" Mar 14 09:02:48 crc kubenswrapper[4956]: I0314 09:02:48.096038 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5d92s" Mar 14 09:02:48 crc kubenswrapper[4956]: I0314 09:02:48.096088 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5d92s" Mar 14 09:02:49 crc kubenswrapper[4956]: I0314 09:02:49.140023 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5d92s" podUID="e7f24159-2581-4653-be69-3aae1dd7e3f9" containerName="registry-server" probeResult="failure" output=< Mar 14 09:02:49 crc kubenswrapper[4956]: timeout: failed to connect service ":50051" within 1s Mar 14 09:02:49 crc kubenswrapper[4956]: > Mar 14 09:02:50 crc kubenswrapper[4956]: I0314 09:02:50.273341 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vzmj4" Mar 14 09:02:50 crc kubenswrapper[4956]: I0314 09:02:50.273730 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vzmj4" Mar 14 09:02:50 crc kubenswrapper[4956]: I0314 09:02:50.344851 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vzmj4" Mar 14 09:02:50 crc kubenswrapper[4956]: I0314 09:02:50.493212 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fnwsk" Mar 14 09:02:50 crc kubenswrapper[4956]: I0314 09:02:50.493284 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fnwsk" Mar 14 09:02:50 crc kubenswrapper[4956]: I0314 09:02:50.553899 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-fnwsk" Mar 14 09:02:50 crc kubenswrapper[4956]: I0314 09:02:50.970626 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fnwsk" Mar 14 09:02:50 crc kubenswrapper[4956]: I0314 09:02:50.987452 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vzmj4" Mar 14 09:02:58 crc kubenswrapper[4956]: I0314 09:02:58.139866 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5d92s" Mar 14 09:02:58 crc kubenswrapper[4956]: I0314 09:02:58.179229 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5d92s" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.094650 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" podUID="b74baec9-353b-4ada-a777-a0cedf80aaf8" containerName="registry" containerID="cri-o://f1fd9c795a6416a1686d42521bb46090006d565ac41f646441ab71a6431a508b" gracePeriod=30 Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.567199 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.654320 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-bound-sa-token\") pod \"b74baec9-353b-4ada-a777-a0cedf80aaf8\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.654454 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b74baec9-353b-4ada-a777-a0cedf80aaf8-installation-pull-secrets\") pod \"b74baec9-353b-4ada-a777-a0cedf80aaf8\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.654509 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b74baec9-353b-4ada-a777-a0cedf80aaf8-registry-certificates\") pod \"b74baec9-353b-4ada-a777-a0cedf80aaf8\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.654540 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-registry-tls\") pod \"b74baec9-353b-4ada-a777-a0cedf80aaf8\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.654569 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b74baec9-353b-4ada-a777-a0cedf80aaf8-ca-trust-extracted\") pod \"b74baec9-353b-4ada-a777-a0cedf80aaf8\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.654608 4956 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-rvktv\" (UniqueName: \"kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-kube-api-access-rvktv\") pod \"b74baec9-353b-4ada-a777-a0cedf80aaf8\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.654644 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b74baec9-353b-4ada-a777-a0cedf80aaf8-trusted-ca\") pod \"b74baec9-353b-4ada-a777-a0cedf80aaf8\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.654748 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b74baec9-353b-4ada-a777-a0cedf80aaf8\" (UID: \"b74baec9-353b-4ada-a777-a0cedf80aaf8\") " Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.656038 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b74baec9-353b-4ada-a777-a0cedf80aaf8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b74baec9-353b-4ada-a777-a0cedf80aaf8" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.661820 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b74baec9-353b-4ada-a777-a0cedf80aaf8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b74baec9-353b-4ada-a777-a0cedf80aaf8" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.672550 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b74baec9-353b-4ada-a777-a0cedf80aaf8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b74baec9-353b-4ada-a777-a0cedf80aaf8" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.672635 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-kube-api-access-rvktv" (OuterVolumeSpecName: "kube-api-access-rvktv") pod "b74baec9-353b-4ada-a777-a0cedf80aaf8" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8"). InnerVolumeSpecName "kube-api-access-rvktv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.673056 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b74baec9-353b-4ada-a777-a0cedf80aaf8" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.673458 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b74baec9-353b-4ada-a777-a0cedf80aaf8" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.682164 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b74baec9-353b-4ada-a777-a0cedf80aaf8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b74baec9-353b-4ada-a777-a0cedf80aaf8" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.686101 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b74baec9-353b-4ada-a777-a0cedf80aaf8" (UID: "b74baec9-353b-4ada-a777-a0cedf80aaf8"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.756441 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b74baec9-353b-4ada-a777-a0cedf80aaf8-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.756493 4956 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.756506 4956 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b74baec9-353b-4ada-a777-a0cedf80aaf8-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.756516 4956 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/b74baec9-353b-4ada-a777-a0cedf80aaf8-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.756525 4956 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.756532 4956 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b74baec9-353b-4ada-a777-a0cedf80aaf8-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.756540 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvktv\" (UniqueName: \"kubernetes.io/projected/b74baec9-353b-4ada-a777-a0cedf80aaf8-kube-api-access-rvktv\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.983602 4956 generic.go:334] "Generic (PLEG): container finished" podID="b74baec9-353b-4ada-a777-a0cedf80aaf8" containerID="f1fd9c795a6416a1686d42521bb46090006d565ac41f646441ab71a6431a508b" exitCode=0 Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.983694 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" event={"ID":"b74baec9-353b-4ada-a777-a0cedf80aaf8","Type":"ContainerDied","Data":"f1fd9c795a6416a1686d42521bb46090006d565ac41f646441ab71a6431a508b"} Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.983767 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.984040 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9s4fs" event={"ID":"b74baec9-353b-4ada-a777-a0cedf80aaf8","Type":"ContainerDied","Data":"0dbfc91da8aabb7fe8599a4348c632eebe6811962e6622612fa1e0d7df9323df"} Mar 14 09:02:59 crc kubenswrapper[4956]: I0314 09:02:59.984076 4956 scope.go:117] "RemoveContainer" containerID="f1fd9c795a6416a1686d42521bb46090006d565ac41f646441ab71a6431a508b" Mar 14 09:03:00 crc kubenswrapper[4956]: I0314 09:03:00.005041 4956 scope.go:117] "RemoveContainer" containerID="f1fd9c795a6416a1686d42521bb46090006d565ac41f646441ab71a6431a508b" Mar 14 09:03:00 crc kubenswrapper[4956]: E0314 09:03:00.005612 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1fd9c795a6416a1686d42521bb46090006d565ac41f646441ab71a6431a508b\": container with ID starting with f1fd9c795a6416a1686d42521bb46090006d565ac41f646441ab71a6431a508b not found: ID does not exist" containerID="f1fd9c795a6416a1686d42521bb46090006d565ac41f646441ab71a6431a508b" Mar 14 09:03:00 crc kubenswrapper[4956]: I0314 09:03:00.005665 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1fd9c795a6416a1686d42521bb46090006d565ac41f646441ab71a6431a508b"} err="failed to get container status \"f1fd9c795a6416a1686d42521bb46090006d565ac41f646441ab71a6431a508b\": rpc error: code = NotFound desc = could not find container \"f1fd9c795a6416a1686d42521bb46090006d565ac41f646441ab71a6431a508b\": container with ID starting with f1fd9c795a6416a1686d42521bb46090006d565ac41f646441ab71a6431a508b not found: ID does not exist" Mar 14 09:03:00 crc kubenswrapper[4956]: I0314 09:03:00.033414 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-9s4fs"] Mar 14 09:03:00 crc kubenswrapper[4956]: I0314 09:03:00.041349 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9s4fs"] Mar 14 09:03:01 crc kubenswrapper[4956]: I0314 09:03:01.222054 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b74baec9-353b-4ada-a777-a0cedf80aaf8" path="/var/lib/kubelet/pods/b74baec9-353b-4ada-a777-a0cedf80aaf8/volumes" Mar 14 09:03:25 crc kubenswrapper[4956]: I0314 09:03:25.423241 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:03:25 crc kubenswrapper[4956]: I0314 09:03:25.424016 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:03:55 crc kubenswrapper[4956]: I0314 09:03:55.423126 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:03:55 crc kubenswrapper[4956]: I0314 09:03:55.423571 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 
09:04:00 crc kubenswrapper[4956]: I0314 09:04:00.140605 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557984-z8grt"] Mar 14 09:04:00 crc kubenswrapper[4956]: E0314 09:04:00.141248 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74baec9-353b-4ada-a777-a0cedf80aaf8" containerName="registry" Mar 14 09:04:00 crc kubenswrapper[4956]: I0314 09:04:00.141263 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74baec9-353b-4ada-a777-a0cedf80aaf8" containerName="registry" Mar 14 09:04:00 crc kubenswrapper[4956]: I0314 09:04:00.141415 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b74baec9-353b-4ada-a777-a0cedf80aaf8" containerName="registry" Mar 14 09:04:00 crc kubenswrapper[4956]: I0314 09:04:00.141898 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557984-z8grt" Mar 14 09:04:00 crc kubenswrapper[4956]: I0314 09:04:00.144382 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:04:00 crc kubenswrapper[4956]: I0314 09:04:00.144402 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:04:00 crc kubenswrapper[4956]: I0314 09:04:00.144635 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:04:00 crc kubenswrapper[4956]: I0314 09:04:00.152959 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557984-z8grt"] Mar 14 09:04:00 crc kubenswrapper[4956]: I0314 09:04:00.325396 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8znwj\" (UniqueName: \"kubernetes.io/projected/a83e5d5f-ebdb-40bd-bede-617a7edba9e0-kube-api-access-8znwj\") pod \"auto-csr-approver-29557984-z8grt\" (UID: 
\"a83e5d5f-ebdb-40bd-bede-617a7edba9e0\") " pod="openshift-infra/auto-csr-approver-29557984-z8grt" Mar 14 09:04:00 crc kubenswrapper[4956]: I0314 09:04:00.426322 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8znwj\" (UniqueName: \"kubernetes.io/projected/a83e5d5f-ebdb-40bd-bede-617a7edba9e0-kube-api-access-8znwj\") pod \"auto-csr-approver-29557984-z8grt\" (UID: \"a83e5d5f-ebdb-40bd-bede-617a7edba9e0\") " pod="openshift-infra/auto-csr-approver-29557984-z8grt" Mar 14 09:04:00 crc kubenswrapper[4956]: I0314 09:04:00.449086 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8znwj\" (UniqueName: \"kubernetes.io/projected/a83e5d5f-ebdb-40bd-bede-617a7edba9e0-kube-api-access-8znwj\") pod \"auto-csr-approver-29557984-z8grt\" (UID: \"a83e5d5f-ebdb-40bd-bede-617a7edba9e0\") " pod="openshift-infra/auto-csr-approver-29557984-z8grt" Mar 14 09:04:00 crc kubenswrapper[4956]: I0314 09:04:00.459818 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557984-z8grt" Mar 14 09:04:00 crc kubenswrapper[4956]: I0314 09:04:00.861430 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557984-z8grt"] Mar 14 09:04:01 crc kubenswrapper[4956]: I0314 09:04:01.314913 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557984-z8grt" event={"ID":"a83e5d5f-ebdb-40bd-bede-617a7edba9e0","Type":"ContainerStarted","Data":"0ed819120a17ea230537d614dee5e2bf6a666f4f109fc563646cb9486b0d6b5e"} Mar 14 09:04:02 crc kubenswrapper[4956]: I0314 09:04:02.323002 4956 generic.go:334] "Generic (PLEG): container finished" podID="a83e5d5f-ebdb-40bd-bede-617a7edba9e0" containerID="5b6822127eb6caa549ad5bb7e7435c50a72e3a7ca443ba687b27be5a6325037a" exitCode=0 Mar 14 09:04:02 crc kubenswrapper[4956]: I0314 09:04:02.323116 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557984-z8grt" event={"ID":"a83e5d5f-ebdb-40bd-bede-617a7edba9e0","Type":"ContainerDied","Data":"5b6822127eb6caa549ad5bb7e7435c50a72e3a7ca443ba687b27be5a6325037a"} Mar 14 09:04:03 crc kubenswrapper[4956]: I0314 09:04:03.579470 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557984-z8grt" Mar 14 09:04:03 crc kubenswrapper[4956]: I0314 09:04:03.764310 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8znwj\" (UniqueName: \"kubernetes.io/projected/a83e5d5f-ebdb-40bd-bede-617a7edba9e0-kube-api-access-8znwj\") pod \"a83e5d5f-ebdb-40bd-bede-617a7edba9e0\" (UID: \"a83e5d5f-ebdb-40bd-bede-617a7edba9e0\") " Mar 14 09:04:03 crc kubenswrapper[4956]: I0314 09:04:03.769432 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a83e5d5f-ebdb-40bd-bede-617a7edba9e0-kube-api-access-8znwj" (OuterVolumeSpecName: "kube-api-access-8znwj") pod "a83e5d5f-ebdb-40bd-bede-617a7edba9e0" (UID: "a83e5d5f-ebdb-40bd-bede-617a7edba9e0"). InnerVolumeSpecName "kube-api-access-8znwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:04:03 crc kubenswrapper[4956]: I0314 09:04:03.865583 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8znwj\" (UniqueName: \"kubernetes.io/projected/a83e5d5f-ebdb-40bd-bede-617a7edba9e0-kube-api-access-8znwj\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:04 crc kubenswrapper[4956]: I0314 09:04:04.336944 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557984-z8grt" event={"ID":"a83e5d5f-ebdb-40bd-bede-617a7edba9e0","Type":"ContainerDied","Data":"0ed819120a17ea230537d614dee5e2bf6a666f4f109fc563646cb9486b0d6b5e"} Mar 14 09:04:04 crc kubenswrapper[4956]: I0314 09:04:04.336981 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ed819120a17ea230537d614dee5e2bf6a666f4f109fc563646cb9486b0d6b5e" Mar 14 09:04:04 crc kubenswrapper[4956]: I0314 09:04:04.336997 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557984-z8grt" Mar 14 09:04:25 crc kubenswrapper[4956]: I0314 09:04:25.423771 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:04:25 crc kubenswrapper[4956]: I0314 09:04:25.424319 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:04:25 crc kubenswrapper[4956]: I0314 09:04:25.424359 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 09:04:25 crc kubenswrapper[4956]: I0314 09:04:25.424895 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b5af7eb19f784abccebf2c980453895184c4d674e7cf075f6fdc0e70b952b563"} pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:04:25 crc kubenswrapper[4956]: I0314 09:04:25.424961 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" containerID="cri-o://b5af7eb19f784abccebf2c980453895184c4d674e7cf075f6fdc0e70b952b563" gracePeriod=600 Mar 14 09:04:26 crc kubenswrapper[4956]: I0314 09:04:26.219280 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="9ba20367-e506-422e-a846-eb1525cb3b94" containerID="b5af7eb19f784abccebf2c980453895184c4d674e7cf075f6fdc0e70b952b563" exitCode=0 Mar 14 09:04:26 crc kubenswrapper[4956]: I0314 09:04:26.219369 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerDied","Data":"b5af7eb19f784abccebf2c980453895184c4d674e7cf075f6fdc0e70b952b563"} Mar 14 09:04:26 crc kubenswrapper[4956]: I0314 09:04:26.219834 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerStarted","Data":"f3de809325f5e89aa2de74df02911d0fee80e431b2a2454fac310773a12e5f0a"} Mar 14 09:04:26 crc kubenswrapper[4956]: I0314 09:04:26.219856 4956 scope.go:117] "RemoveContainer" containerID="b126c5e8dac0183ac3f1b4d3f310791b0219224ca6f75c23ed7ffbf39bd12cee" Mar 14 09:06:00 crc kubenswrapper[4956]: I0314 09:06:00.140094 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557986-xg5j9"] Mar 14 09:06:00 crc kubenswrapper[4956]: E0314 09:06:00.141314 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a83e5d5f-ebdb-40bd-bede-617a7edba9e0" containerName="oc" Mar 14 09:06:00 crc kubenswrapper[4956]: I0314 09:06:00.141334 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a83e5d5f-ebdb-40bd-bede-617a7edba9e0" containerName="oc" Mar 14 09:06:00 crc kubenswrapper[4956]: I0314 09:06:00.141518 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="a83e5d5f-ebdb-40bd-bede-617a7edba9e0" containerName="oc" Mar 14 09:06:00 crc kubenswrapper[4956]: I0314 09:06:00.141958 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557986-xg5j9" Mar 14 09:06:00 crc kubenswrapper[4956]: I0314 09:06:00.144186 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:06:00 crc kubenswrapper[4956]: I0314 09:06:00.145364 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:06:00 crc kubenswrapper[4956]: I0314 09:06:00.152073 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557986-xg5j9"] Mar 14 09:06:00 crc kubenswrapper[4956]: I0314 09:06:00.152203 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:06:00 crc kubenswrapper[4956]: I0314 09:06:00.341846 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2fb9\" (UniqueName: \"kubernetes.io/projected/03f526a2-e2d3-4894-9088-69073b2af2fd-kube-api-access-t2fb9\") pod \"auto-csr-approver-29557986-xg5j9\" (UID: \"03f526a2-e2d3-4894-9088-69073b2af2fd\") " pod="openshift-infra/auto-csr-approver-29557986-xg5j9" Mar 14 09:06:00 crc kubenswrapper[4956]: I0314 09:06:00.443112 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2fb9\" (UniqueName: \"kubernetes.io/projected/03f526a2-e2d3-4894-9088-69073b2af2fd-kube-api-access-t2fb9\") pod \"auto-csr-approver-29557986-xg5j9\" (UID: \"03f526a2-e2d3-4894-9088-69073b2af2fd\") " pod="openshift-infra/auto-csr-approver-29557986-xg5j9" Mar 14 09:06:00 crc kubenswrapper[4956]: I0314 09:06:00.471467 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2fb9\" (UniqueName: \"kubernetes.io/projected/03f526a2-e2d3-4894-9088-69073b2af2fd-kube-api-access-t2fb9\") pod \"auto-csr-approver-29557986-xg5j9\" (UID: \"03f526a2-e2d3-4894-9088-69073b2af2fd\") " 
pod="openshift-infra/auto-csr-approver-29557986-xg5j9" Mar 14 09:06:00 crc kubenswrapper[4956]: I0314 09:06:00.767098 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557986-xg5j9" Mar 14 09:06:01 crc kubenswrapper[4956]: I0314 09:06:01.140831 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557986-xg5j9"] Mar 14 09:06:01 crc kubenswrapper[4956]: W0314 09:06:01.148136 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03f526a2_e2d3_4894_9088_69073b2af2fd.slice/crio-6b9ef6d376fc03e68f394b6ce3624e766edbc2eee434ec418914edc6d8c0bfd2 WatchSource:0}: Error finding container 6b9ef6d376fc03e68f394b6ce3624e766edbc2eee434ec418914edc6d8c0bfd2: Status 404 returned error can't find the container with id 6b9ef6d376fc03e68f394b6ce3624e766edbc2eee434ec418914edc6d8c0bfd2 Mar 14 09:06:01 crc kubenswrapper[4956]: I0314 09:06:01.150499 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:06:01 crc kubenswrapper[4956]: I0314 09:06:01.731240 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557986-xg5j9" event={"ID":"03f526a2-e2d3-4894-9088-69073b2af2fd","Type":"ContainerStarted","Data":"6b9ef6d376fc03e68f394b6ce3624e766edbc2eee434ec418914edc6d8c0bfd2"} Mar 14 09:06:02 crc kubenswrapper[4956]: I0314 09:06:02.740666 4956 generic.go:334] "Generic (PLEG): container finished" podID="03f526a2-e2d3-4894-9088-69073b2af2fd" containerID="ebf340fa253d07b31a51fb0a4060c27fc5e0c80c1c4c4539b91403c6683bd953" exitCode=0 Mar 14 09:06:02 crc kubenswrapper[4956]: I0314 09:06:02.740849 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557986-xg5j9" 
event={"ID":"03f526a2-e2d3-4894-9088-69073b2af2fd","Type":"ContainerDied","Data":"ebf340fa253d07b31a51fb0a4060c27fc5e0c80c1c4c4539b91403c6683bd953"} Mar 14 09:06:03 crc kubenswrapper[4956]: I0314 09:06:03.944467 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557986-xg5j9" Mar 14 09:06:04 crc kubenswrapper[4956]: I0314 09:06:04.083668 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2fb9\" (UniqueName: \"kubernetes.io/projected/03f526a2-e2d3-4894-9088-69073b2af2fd-kube-api-access-t2fb9\") pod \"03f526a2-e2d3-4894-9088-69073b2af2fd\" (UID: \"03f526a2-e2d3-4894-9088-69073b2af2fd\") " Mar 14 09:06:04 crc kubenswrapper[4956]: I0314 09:06:04.088919 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03f526a2-e2d3-4894-9088-69073b2af2fd-kube-api-access-t2fb9" (OuterVolumeSpecName: "kube-api-access-t2fb9") pod "03f526a2-e2d3-4894-9088-69073b2af2fd" (UID: "03f526a2-e2d3-4894-9088-69073b2af2fd"). InnerVolumeSpecName "kube-api-access-t2fb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:06:04 crc kubenswrapper[4956]: I0314 09:06:04.184875 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2fb9\" (UniqueName: \"kubernetes.io/projected/03f526a2-e2d3-4894-9088-69073b2af2fd-kube-api-access-t2fb9\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:04 crc kubenswrapper[4956]: I0314 09:06:04.753138 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557986-xg5j9" event={"ID":"03f526a2-e2d3-4894-9088-69073b2af2fd","Type":"ContainerDied","Data":"6b9ef6d376fc03e68f394b6ce3624e766edbc2eee434ec418914edc6d8c0bfd2"} Mar 14 09:06:04 crc kubenswrapper[4956]: I0314 09:06:04.753171 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557986-xg5j9" Mar 14 09:06:04 crc kubenswrapper[4956]: I0314 09:06:04.753191 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b9ef6d376fc03e68f394b6ce3624e766edbc2eee434ec418914edc6d8c0bfd2" Mar 14 09:06:05 crc kubenswrapper[4956]: I0314 09:06:05.010903 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557980-v2mp9"] Mar 14 09:06:05 crc kubenswrapper[4956]: I0314 09:06:05.015857 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557980-v2mp9"] Mar 14 09:06:05 crc kubenswrapper[4956]: I0314 09:06:05.216456 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c" path="/var/lib/kubelet/pods/f8c4ffd1-3ad7-4eef-bbec-b48d5a86ee5c/volumes" Mar 14 09:06:25 crc kubenswrapper[4956]: I0314 09:06:25.423870 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:06:25 crc kubenswrapper[4956]: I0314 09:06:25.424552 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:06:35 crc kubenswrapper[4956]: I0314 09:06:35.546853 4956 scope.go:117] "RemoveContainer" containerID="b5517deab51d671273b4cea4f69e3a120cc15c2c5c4c6c8297d475a4727b8c01" Mar 14 09:06:55 crc kubenswrapper[4956]: I0314 09:06:55.423368 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:06:55 crc kubenswrapper[4956]: I0314 09:06:55.424027 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:07:25 crc kubenswrapper[4956]: I0314 09:07:25.423341 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:07:25 crc kubenswrapper[4956]: I0314 09:07:25.423799 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:07:25 crc kubenswrapper[4956]: I0314 09:07:25.423843 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 09:07:25 crc kubenswrapper[4956]: I0314 09:07:25.424327 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3de809325f5e89aa2de74df02911d0fee80e431b2a2454fac310773a12e5f0a"} pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:07:25 crc 
kubenswrapper[4956]: I0314 09:07:25.424372 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" containerID="cri-o://f3de809325f5e89aa2de74df02911d0fee80e431b2a2454fac310773a12e5f0a" gracePeriod=600 Mar 14 09:07:26 crc kubenswrapper[4956]: I0314 09:07:26.326370 4956 generic.go:334] "Generic (PLEG): container finished" podID="9ba20367-e506-422e-a846-eb1525cb3b94" containerID="f3de809325f5e89aa2de74df02911d0fee80e431b2a2454fac310773a12e5f0a" exitCode=0 Mar 14 09:07:26 crc kubenswrapper[4956]: I0314 09:07:26.326463 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerDied","Data":"f3de809325f5e89aa2de74df02911d0fee80e431b2a2454fac310773a12e5f0a"} Mar 14 09:07:26 crc kubenswrapper[4956]: I0314 09:07:26.326903 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerStarted","Data":"a47c6c00ee731cd8c4ec91df013e1936bd43fead85afc21e044951f9ea4d95f7"} Mar 14 09:07:26 crc kubenswrapper[4956]: I0314 09:07:26.326965 4956 scope.go:117] "RemoveContainer" containerID="b5af7eb19f784abccebf2c980453895184c4d674e7cf075f6fdc0e70b952b563" Mar 14 09:07:35 crc kubenswrapper[4956]: I0314 09:07:35.601115 4956 scope.go:117] "RemoveContainer" containerID="38ef84b9567c4183943626c57109c3afa6f7da31552e7a5ac819292bf95fc7be" Mar 14 09:08:00 crc kubenswrapper[4956]: I0314 09:08:00.135112 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557988-hp29s"] Mar 14 09:08:00 crc kubenswrapper[4956]: E0314 09:08:00.135901 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="03f526a2-e2d3-4894-9088-69073b2af2fd" containerName="oc" Mar 14 09:08:00 crc kubenswrapper[4956]: I0314 09:08:00.135918 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f526a2-e2d3-4894-9088-69073b2af2fd" containerName="oc" Mar 14 09:08:00 crc kubenswrapper[4956]: I0314 09:08:00.136037 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f526a2-e2d3-4894-9088-69073b2af2fd" containerName="oc" Mar 14 09:08:00 crc kubenswrapper[4956]: I0314 09:08:00.136462 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557988-hp29s" Mar 14 09:08:00 crc kubenswrapper[4956]: I0314 09:08:00.140441 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:08:00 crc kubenswrapper[4956]: I0314 09:08:00.140473 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:08:00 crc kubenswrapper[4956]: I0314 09:08:00.140823 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:08:00 crc kubenswrapper[4956]: I0314 09:08:00.141165 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557988-hp29s"] Mar 14 09:08:00 crc kubenswrapper[4956]: I0314 09:08:00.179315 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tswk\" (UniqueName: \"kubernetes.io/projected/ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af-kube-api-access-6tswk\") pod \"auto-csr-approver-29557988-hp29s\" (UID: \"ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af\") " pod="openshift-infra/auto-csr-approver-29557988-hp29s" Mar 14 09:08:00 crc kubenswrapper[4956]: I0314 09:08:00.280712 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tswk\" (UniqueName: 
\"kubernetes.io/projected/ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af-kube-api-access-6tswk\") pod \"auto-csr-approver-29557988-hp29s\" (UID: \"ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af\") " pod="openshift-infra/auto-csr-approver-29557988-hp29s" Mar 14 09:08:00 crc kubenswrapper[4956]: I0314 09:08:00.305593 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tswk\" (UniqueName: \"kubernetes.io/projected/ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af-kube-api-access-6tswk\") pod \"auto-csr-approver-29557988-hp29s\" (UID: \"ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af\") " pod="openshift-infra/auto-csr-approver-29557988-hp29s" Mar 14 09:08:00 crc kubenswrapper[4956]: I0314 09:08:00.450969 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557988-hp29s" Mar 14 09:08:00 crc kubenswrapper[4956]: I0314 09:08:00.646497 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557988-hp29s"] Mar 14 09:08:00 crc kubenswrapper[4956]: I0314 09:08:00.886382 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557988-hp29s" event={"ID":"ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af","Type":"ContainerStarted","Data":"52c87394573cc215c56d19ccd0fcfe3ea7029868d93e1bb4cbf6e186766206ca"} Mar 14 09:08:03 crc kubenswrapper[4956]: I0314 09:08:03.908688 4956 generic.go:334] "Generic (PLEG): container finished" podID="ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af" containerID="1ed45b2cc45b07140a5e8d2eb88b089bbaf5997b0fc8775aa61f71a8adaa2d1d" exitCode=0 Mar 14 09:08:03 crc kubenswrapper[4956]: I0314 09:08:03.908806 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557988-hp29s" event={"ID":"ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af","Type":"ContainerDied","Data":"1ed45b2cc45b07140a5e8d2eb88b089bbaf5997b0fc8775aa61f71a8adaa2d1d"} Mar 14 09:08:05 crc kubenswrapper[4956]: I0314 09:08:05.138282 4956 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557988-hp29s" Mar 14 09:08:05 crc kubenswrapper[4956]: I0314 09:08:05.335000 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tswk\" (UniqueName: \"kubernetes.io/projected/ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af-kube-api-access-6tswk\") pod \"ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af\" (UID: \"ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af\") " Mar 14 09:08:05 crc kubenswrapper[4956]: I0314 09:08:05.339685 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af-kube-api-access-6tswk" (OuterVolumeSpecName: "kube-api-access-6tswk") pod "ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af" (UID: "ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af"). InnerVolumeSpecName "kube-api-access-6tswk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:08:05 crc kubenswrapper[4956]: I0314 09:08:05.435884 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tswk\" (UniqueName: \"kubernetes.io/projected/ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af-kube-api-access-6tswk\") on node \"crc\" DevicePath \"\"" Mar 14 09:08:05 crc kubenswrapper[4956]: I0314 09:08:05.926981 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557988-hp29s" event={"ID":"ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af","Type":"ContainerDied","Data":"52c87394573cc215c56d19ccd0fcfe3ea7029868d93e1bb4cbf6e186766206ca"} Mar 14 09:08:05 crc kubenswrapper[4956]: I0314 09:08:05.927023 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52c87394573cc215c56d19ccd0fcfe3ea7029868d93e1bb4cbf6e186766206ca" Mar 14 09:08:05 crc kubenswrapper[4956]: I0314 09:08:05.927068 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557988-hp29s" Mar 14 09:08:06 crc kubenswrapper[4956]: I0314 09:08:06.192260 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557982-vxp8p"] Mar 14 09:08:06 crc kubenswrapper[4956]: I0314 09:08:06.195413 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557982-vxp8p"] Mar 14 09:08:07 crc kubenswrapper[4956]: I0314 09:08:07.222076 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6f200e5-e07e-4c92-a04a-1ed65b8e44ab" path="/var/lib/kubelet/pods/a6f200e5-e07e-4c92-a04a-1ed65b8e44ab/volumes" Mar 14 09:08:35 crc kubenswrapper[4956]: I0314 09:08:35.641130 4956 scope.go:117] "RemoveContainer" containerID="bb6518e780204b6f4d2781048c3386bfb7a58fbbfe600bf0f999d0c93ff5bd1d" Mar 14 09:08:35 crc kubenswrapper[4956]: I0314 09:08:35.665684 4956 scope.go:117] "RemoveContainer" containerID="8eadb51a233f4b014f70bbfe5913d9b0ff694d111108f7fd2a90cb9e27e5161f" Mar 14 09:08:59 crc kubenswrapper[4956]: I0314 09:08:59.021631 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl"] Mar 14 09:08:59 crc kubenswrapper[4956]: E0314 09:08:59.022441 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af" containerName="oc" Mar 14 09:08:59 crc kubenswrapper[4956]: I0314 09:08:59.022460 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af" containerName="oc" Mar 14 09:08:59 crc kubenswrapper[4956]: I0314 09:08:59.022620 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af" containerName="oc" Mar 14 09:08:59 crc kubenswrapper[4956]: I0314 09:08:59.023578 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" Mar 14 09:08:59 crc kubenswrapper[4956]: I0314 09:08:59.026061 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 09:08:59 crc kubenswrapper[4956]: I0314 09:08:59.031666 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl"] Mar 14 09:08:59 crc kubenswrapper[4956]: I0314 09:08:59.089938 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/804193a1-2db0-4ac9-a126-4c79735f8302-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl\" (UID: \"804193a1-2db0-4ac9-a126-4c79735f8302\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" Mar 14 09:08:59 crc kubenswrapper[4956]: I0314 09:08:59.090731 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/804193a1-2db0-4ac9-a126-4c79735f8302-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl\" (UID: \"804193a1-2db0-4ac9-a126-4c79735f8302\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" Mar 14 09:08:59 crc kubenswrapper[4956]: I0314 09:08:59.091193 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxt94\" (UniqueName: \"kubernetes.io/projected/804193a1-2db0-4ac9-a126-4c79735f8302-kube-api-access-fxt94\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl\" (UID: \"804193a1-2db0-4ac9-a126-4c79735f8302\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" Mar 14 09:08:59 crc kubenswrapper[4956]: 
I0314 09:08:59.192540 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxt94\" (UniqueName: \"kubernetes.io/projected/804193a1-2db0-4ac9-a126-4c79735f8302-kube-api-access-fxt94\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl\" (UID: \"804193a1-2db0-4ac9-a126-4c79735f8302\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" Mar 14 09:08:59 crc kubenswrapper[4956]: I0314 09:08:59.192613 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/804193a1-2db0-4ac9-a126-4c79735f8302-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl\" (UID: \"804193a1-2db0-4ac9-a126-4c79735f8302\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" Mar 14 09:08:59 crc kubenswrapper[4956]: I0314 09:08:59.192640 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/804193a1-2db0-4ac9-a126-4c79735f8302-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl\" (UID: \"804193a1-2db0-4ac9-a126-4c79735f8302\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" Mar 14 09:08:59 crc kubenswrapper[4956]: I0314 09:08:59.193387 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/804193a1-2db0-4ac9-a126-4c79735f8302-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl\" (UID: \"804193a1-2db0-4ac9-a126-4c79735f8302\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" Mar 14 09:08:59 crc kubenswrapper[4956]: I0314 09:08:59.193505 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/804193a1-2db0-4ac9-a126-4c79735f8302-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl\" (UID: \"804193a1-2db0-4ac9-a126-4c79735f8302\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" Mar 14 09:08:59 crc kubenswrapper[4956]: I0314 09:08:59.211151 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxt94\" (UniqueName: \"kubernetes.io/projected/804193a1-2db0-4ac9-a126-4c79735f8302-kube-api-access-fxt94\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl\" (UID: \"804193a1-2db0-4ac9-a126-4c79735f8302\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" Mar 14 09:08:59 crc kubenswrapper[4956]: I0314 09:08:59.341128 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" Mar 14 09:08:59 crc kubenswrapper[4956]: I0314 09:08:59.524227 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl"] Mar 14 09:09:00 crc kubenswrapper[4956]: I0314 09:09:00.287183 4956 generic.go:334] "Generic (PLEG): container finished" podID="804193a1-2db0-4ac9-a126-4c79735f8302" containerID="42ddbf72ed46a71e514e85c27fb7af7de763555b78707829bae6aaeb1419c433" exitCode=0 Mar 14 09:09:00 crc kubenswrapper[4956]: I0314 09:09:00.287419 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" event={"ID":"804193a1-2db0-4ac9-a126-4c79735f8302","Type":"ContainerDied","Data":"42ddbf72ed46a71e514e85c27fb7af7de763555b78707829bae6aaeb1419c433"} Mar 14 09:09:00 crc kubenswrapper[4956]: I0314 09:09:00.288259 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" event={"ID":"804193a1-2db0-4ac9-a126-4c79735f8302","Type":"ContainerStarted","Data":"de421d6da8a06fd94384ad93ebdf36f7e1586277defc5d3a138f6639f02f3966"} Mar 14 09:09:03 crc kubenswrapper[4956]: I0314 09:09:03.305330 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" event={"ID":"804193a1-2db0-4ac9-a126-4c79735f8302","Type":"ContainerStarted","Data":"0d671fbbae77963780b5789f3e1932fd167984804b8384314b8e31050f4f35c0"} Mar 14 09:09:04 crc kubenswrapper[4956]: I0314 09:09:04.311763 4956 generic.go:334] "Generic (PLEG): container finished" podID="804193a1-2db0-4ac9-a126-4c79735f8302" containerID="0d671fbbae77963780b5789f3e1932fd167984804b8384314b8e31050f4f35c0" exitCode=0 Mar 14 09:09:04 crc kubenswrapper[4956]: I0314 09:09:04.311812 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" event={"ID":"804193a1-2db0-4ac9-a126-4c79735f8302","Type":"ContainerDied","Data":"0d671fbbae77963780b5789f3e1932fd167984804b8384314b8e31050f4f35c0"} Mar 14 09:09:05 crc kubenswrapper[4956]: I0314 09:09:05.319329 4956 generic.go:334] "Generic (PLEG): container finished" podID="804193a1-2db0-4ac9-a126-4c79735f8302" containerID="2eac28cd4c321c1981cb115b971f6ceade1f67e8f214e1642e8ce7d5eede5406" exitCode=0 Mar 14 09:09:05 crc kubenswrapper[4956]: I0314 09:09:05.319416 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" event={"ID":"804193a1-2db0-4ac9-a126-4c79735f8302","Type":"ContainerDied","Data":"2eac28cd4c321c1981cb115b971f6ceade1f67e8f214e1642e8ce7d5eede5406"} Mar 14 09:09:06 crc kubenswrapper[4956]: I0314 09:09:06.531063 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" Mar 14 09:09:06 crc kubenswrapper[4956]: I0314 09:09:06.647257 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/804193a1-2db0-4ac9-a126-4c79735f8302-bundle\") pod \"804193a1-2db0-4ac9-a126-4c79735f8302\" (UID: \"804193a1-2db0-4ac9-a126-4c79735f8302\") " Mar 14 09:09:06 crc kubenswrapper[4956]: I0314 09:09:06.647305 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxt94\" (UniqueName: \"kubernetes.io/projected/804193a1-2db0-4ac9-a126-4c79735f8302-kube-api-access-fxt94\") pod \"804193a1-2db0-4ac9-a126-4c79735f8302\" (UID: \"804193a1-2db0-4ac9-a126-4c79735f8302\") " Mar 14 09:09:06 crc kubenswrapper[4956]: I0314 09:09:06.647339 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/804193a1-2db0-4ac9-a126-4c79735f8302-util\") pod \"804193a1-2db0-4ac9-a126-4c79735f8302\" (UID: \"804193a1-2db0-4ac9-a126-4c79735f8302\") " Mar 14 09:09:06 crc kubenswrapper[4956]: I0314 09:09:06.649583 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804193a1-2db0-4ac9-a126-4c79735f8302-bundle" (OuterVolumeSpecName: "bundle") pod "804193a1-2db0-4ac9-a126-4c79735f8302" (UID: "804193a1-2db0-4ac9-a126-4c79735f8302"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:09:06 crc kubenswrapper[4956]: I0314 09:09:06.653270 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804193a1-2db0-4ac9-a126-4c79735f8302-kube-api-access-fxt94" (OuterVolumeSpecName: "kube-api-access-fxt94") pod "804193a1-2db0-4ac9-a126-4c79735f8302" (UID: "804193a1-2db0-4ac9-a126-4c79735f8302"). InnerVolumeSpecName "kube-api-access-fxt94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:09:06 crc kubenswrapper[4956]: I0314 09:09:06.657920 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804193a1-2db0-4ac9-a126-4c79735f8302-util" (OuterVolumeSpecName: "util") pod "804193a1-2db0-4ac9-a126-4c79735f8302" (UID: "804193a1-2db0-4ac9-a126-4c79735f8302"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:09:06 crc kubenswrapper[4956]: I0314 09:09:06.748856 4956 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/804193a1-2db0-4ac9-a126-4c79735f8302-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:06 crc kubenswrapper[4956]: I0314 09:09:06.748894 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxt94\" (UniqueName: \"kubernetes.io/projected/804193a1-2db0-4ac9-a126-4c79735f8302-kube-api-access-fxt94\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:06 crc kubenswrapper[4956]: I0314 09:09:06.748907 4956 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/804193a1-2db0-4ac9-a126-4c79735f8302-util\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:07 crc kubenswrapper[4956]: I0314 09:09:07.336984 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" event={"ID":"804193a1-2db0-4ac9-a126-4c79735f8302","Type":"ContainerDied","Data":"de421d6da8a06fd94384ad93ebdf36f7e1586277defc5d3a138f6639f02f3966"} Mar 14 09:09:07 crc kubenswrapper[4956]: I0314 09:09:07.337019 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de421d6da8a06fd94384ad93ebdf36f7e1586277defc5d3a138f6639f02f3966" Mar 14 09:09:07 crc kubenswrapper[4956]: I0314 09:09:07.337134 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl" Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.251401 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qj4pg"] Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.252212 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovn-controller" containerID="cri-o://ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b" gracePeriod=30 Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.252297 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="nbdb" containerID="cri-o://721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f" gracePeriod=30 Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.252397 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="northd" containerID="cri-o://179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2" gracePeriod=30 Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.252446 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c" gracePeriod=30 Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.252512 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" 
containerName="kube-rbac-proxy-node" containerID="cri-o://6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4" gracePeriod=30 Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.252568 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovn-acl-logging" containerID="cri-o://84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f" gracePeriod=30 Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.252752 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="sbdb" containerID="cri-o://02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa" gracePeriod=30 Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.288844 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovnkube-controller" containerID="cri-o://641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77" gracePeriod=30 Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.352053 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgnxb_7528e098-09d4-436f-a32d-a0e82e76b8e0/kube-multus/1.log" Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.352618 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgnxb_7528e098-09d4-436f-a32d-a0e82e76b8e0/kube-multus/0.log" Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.352654 4956 generic.go:334] "Generic (PLEG): container finished" podID="7528e098-09d4-436f-a32d-a0e82e76b8e0" containerID="e11575d346470f0c65bf883c0676009985f639d05b04ccb994919585ff0ae99a" exitCode=2 Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.352683 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgnxb" event={"ID":"7528e098-09d4-436f-a32d-a0e82e76b8e0","Type":"ContainerDied","Data":"e11575d346470f0c65bf883c0676009985f639d05b04ccb994919585ff0ae99a"} Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.352713 4956 scope.go:117] "RemoveContainer" containerID="cf0d744bef0eb2db5bac4d9be242309ec646b1106a3017b861efaf49ea589767" Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.353198 4956 scope.go:117] "RemoveContainer" containerID="e11575d346470f0c65bf883c0676009985f639d05b04ccb994919585ff0ae99a" Mar 14 09:09:09 crc kubenswrapper[4956]: E0314 09:09:09.353381 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-sgnxb_openshift-multus(7528e098-09d4-436f-a32d-a0e82e76b8e0)\"" pod="openshift-multus/multus-sgnxb" podUID="7528e098-09d4-436f-a32d-a0e82e76b8e0" Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.994762 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj4pg_57d4b4cb-2115-421e-8f2a-491ec851328c/ovnkube-controller/2.log" Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.996961 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj4pg_57d4b4cb-2115-421e-8f2a-491ec851328c/ovn-acl-logging/0.log" Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.997391 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj4pg_57d4b4cb-2115-421e-8f2a-491ec851328c/ovn-controller/0.log" Mar 14 09:09:09 crc kubenswrapper[4956]: I0314 09:09:09.997844 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.044830 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qs67b"] Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.045093 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="kube-rbac-proxy-node" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045116 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="kube-rbac-proxy-node" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.045132 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovnkube-controller" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045141 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovnkube-controller" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.045151 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="kubecfg-setup" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045160 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="kubecfg-setup" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.045175 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovn-acl-logging" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045183 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovn-acl-logging" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.045193 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="804193a1-2db0-4ac9-a126-4c79735f8302" containerName="pull" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045200 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="804193a1-2db0-4ac9-a126-4c79735f8302" containerName="pull" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.045211 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804193a1-2db0-4ac9-a126-4c79735f8302" containerName="util" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045218 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="804193a1-2db0-4ac9-a126-4c79735f8302" containerName="util" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.045228 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804193a1-2db0-4ac9-a126-4c79735f8302" containerName="extract" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045235 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="804193a1-2db0-4ac9-a126-4c79735f8302" containerName="extract" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.045248 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="nbdb" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045257 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="nbdb" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.045266 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="sbdb" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045274 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="sbdb" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.045283 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovn-controller" Mar 14 09:09:10 crc kubenswrapper[4956]: 
I0314 09:09:10.045290 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovn-controller" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.045301 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="northd" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045308 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="northd" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.045319 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="kube-rbac-proxy-ovn-metrics" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045327 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="kube-rbac-proxy-ovn-metrics" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.045340 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovnkube-controller" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045347 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovnkube-controller" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.045355 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovnkube-controller" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045378 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovnkube-controller" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045513 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="kube-rbac-proxy-node" Mar 14 09:09:10 crc 
kubenswrapper[4956]: I0314 09:09:10.045531 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="kube-rbac-proxy-ovn-metrics" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045539 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovnkube-controller" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045551 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovnkube-controller" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045561 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="sbdb" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045571 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovn-controller" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045580 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="804193a1-2db0-4ac9-a126-4c79735f8302" containerName="extract" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045590 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovn-acl-logging" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045598 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="northd" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045607 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovnkube-controller" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045616 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="nbdb" Mar 14 
09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.045734 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovnkube-controller" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045745 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovnkube-controller" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.045852 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerName="ovnkube-controller" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.047653 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.087981 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-ovn\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088045 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-systemd\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088071 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-node-log\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088119 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-kubelet\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088126 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088140 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-run-netns\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088183 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088183 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-node-log" (OuterVolumeSpecName: "node-log") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088213 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-log-socket\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088254 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-openvswitch\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088280 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkxwz\" (UniqueName: \"kubernetes.io/projected/57d4b4cb-2115-421e-8f2a-491ec851328c-kube-api-access-wkxwz\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088304 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-var-lib-openvswitch\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088266 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088298 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088311 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-log-socket" (OuterVolumeSpecName: "log-socket") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088362 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088329 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088342 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-run-ovn-kubernetes\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088405 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-env-overrides\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088434 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-slash\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088452 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-cni-bin\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088476 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-ovnkube-script-lib\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088471 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-slash" (OuterVolumeSpecName: "host-slash") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088513 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57d4b4cb-2115-421e-8f2a-491ec851328c-ovn-node-metrics-cert\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088548 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-etc-openvswitch\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088562 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-cni-netd\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088513 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088582 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-systemd-units\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088602 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088613 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088625 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088651 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-ovnkube-config\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088691 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"57d4b4cb-2115-421e-8f2a-491ec851328c\" (UID: \"57d4b4cb-2115-421e-8f2a-491ec851328c\") " Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088789 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088849 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088862 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7ada84a5-3502-4dba-a59a-3742895ecf23-ovn-node-metrics-cert\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088887 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088902 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-run-systemd\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088929 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b76nv\" (UniqueName: \"kubernetes.io/projected/7ada84a5-3502-4dba-a59a-3742895ecf23-kube-api-access-b76nv\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088956 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-slash\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088977 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-cni-bin\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.088998 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-cni-netd\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089020 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7ada84a5-3502-4dba-a59a-3742895ecf23-env-overrides\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089089 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089279 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089311 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-systemd-units\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089341 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-node-log\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089365 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-run-ovn-kubernetes\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089398 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-run-netns\") 
pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089422 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-run-ovn\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089444 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-kubelet\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089467 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-var-lib-openvswitch\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089506 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-run-openvswitch\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089527 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/7ada84a5-3502-4dba-a59a-3742895ecf23-ovnkube-config\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089552 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-log-socket\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089579 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7ada84a5-3502-4dba-a59a-3742895ecf23-ovnkube-script-lib\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089629 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-etc-openvswitch\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089718 4956 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-log-socket\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089733 4956 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 
09:09:10.089746 4956 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089757 4956 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089799 4956 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089818 4956 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089830 4956 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089841 4956 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-slash\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089851 4956 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089864 4956 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089874 4956 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089884 4956 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089892 4956 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089900 4956 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57d4b4cb-2115-421e-8f2a-491ec851328c-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089908 4956 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089917 4956 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.089925 4956 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-node-log\") 
on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.093158 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d4b4cb-2115-421e-8f2a-491ec851328c-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.095958 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57d4b4cb-2115-421e-8f2a-491ec851328c-kube-api-access-wkxwz" (OuterVolumeSpecName: "kube-api-access-wkxwz") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "kube-api-access-wkxwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.100765 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "57d4b4cb-2115-421e-8f2a-491ec851328c" (UID: "57d4b4cb-2115-421e-8f2a-491ec851328c"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191322 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-etc-openvswitch\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191400 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7ada84a5-3502-4dba-a59a-3742895ecf23-ovn-node-metrics-cert\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191420 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-run-systemd\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191437 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b76nv\" (UniqueName: \"kubernetes.io/projected/7ada84a5-3502-4dba-a59a-3742895ecf23-kube-api-access-b76nv\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191447 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-etc-openvswitch\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 
09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191454 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-slash\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191468 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-cni-bin\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191493 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-run-systemd\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191499 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-cni-netd\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191537 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-slash\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191544 4956 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7ada84a5-3502-4dba-a59a-3742895ecf23-env-overrides\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191564 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-cni-bin\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191519 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-cni-netd\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191577 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191602 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-systemd-units\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191626 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-node-log\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191650 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-run-ovn-kubernetes\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191674 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-run-netns\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191694 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-run-ovn\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191715 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-kubelet\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191747 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-node-log\") pod 
\"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191751 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-var-lib-openvswitch\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191776 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-var-lib-openvswitch\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191777 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-run-openvswitch\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191793 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-run-openvswitch\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191809 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7ada84a5-3502-4dba-a59a-3742895ecf23-ovnkube-config\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191817 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191832 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-log-socket\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191845 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-systemd-units\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191853 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7ada84a5-3502-4dba-a59a-3742895ecf23-ovnkube-script-lib\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191903 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-kubelet\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 
14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191929 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-run-netns\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191935 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-host-run-ovn-kubernetes\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191948 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-run-ovn\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.191973 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7ada84a5-3502-4dba-a59a-3742895ecf23-log-socket\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.192117 4956 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57d4b4cb-2115-421e-8f2a-491ec851328c-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.192137 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkxwz\" (UniqueName: 
\"kubernetes.io/projected/57d4b4cb-2115-421e-8f2a-491ec851328c-kube-api-access-wkxwz\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.192150 4956 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57d4b4cb-2115-421e-8f2a-491ec851328c-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.192175 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7ada84a5-3502-4dba-a59a-3742895ecf23-env-overrides\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.192456 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7ada84a5-3502-4dba-a59a-3742895ecf23-ovnkube-config\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.192466 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7ada84a5-3502-4dba-a59a-3742895ecf23-ovnkube-script-lib\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.194871 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7ada84a5-3502-4dba-a59a-3742895ecf23-ovn-node-metrics-cert\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.207747 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b76nv\" (UniqueName: \"kubernetes.io/projected/7ada84a5-3502-4dba-a59a-3742895ecf23-kube-api-access-b76nv\") pod \"ovnkube-node-qs67b\" (UID: \"7ada84a5-3502-4dba-a59a-3742895ecf23\") " pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.357458 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgnxb_7528e098-09d4-436f-a32d-a0e82e76b8e0/kube-multus/1.log" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.360937 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj4pg_57d4b4cb-2115-421e-8f2a-491ec851328c/ovnkube-controller/2.log" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.362782 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.363763 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj4pg_57d4b4cb-2115-421e-8f2a-491ec851328c/ovn-acl-logging/0.log" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.368239 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj4pg_57d4b4cb-2115-421e-8f2a-491ec851328c/ovn-controller/0.log" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.368932 4956 generic.go:334] "Generic (PLEG): container finished" podID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerID="641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77" exitCode=0 Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.368970 4956 generic.go:334] "Generic (PLEG): container finished" podID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerID="02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa" exitCode=0 Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.368980 4956 
generic.go:334] "Generic (PLEG): container finished" podID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerID="721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f" exitCode=0 Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.368990 4956 generic.go:334] "Generic (PLEG): container finished" podID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerID="179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2" exitCode=0 Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.368997 4956 generic.go:334] "Generic (PLEG): container finished" podID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerID="505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c" exitCode=0 Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369006 4956 generic.go:334] "Generic (PLEG): container finished" podID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerID="6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4" exitCode=0 Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369014 4956 generic.go:334] "Generic (PLEG): container finished" podID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerID="84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f" exitCode=143 Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369025 4956 generic.go:334] "Generic (PLEG): container finished" podID="57d4b4cb-2115-421e-8f2a-491ec851328c" containerID="ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b" exitCode=143 Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369049 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerDied","Data":"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369084 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" 
event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerDied","Data":"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369098 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerDied","Data":"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369107 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerDied","Data":"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369117 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerDied","Data":"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369143 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerDied","Data":"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369153 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369162 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369168 4956 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369173 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369178 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369185 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369190 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369197 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369202 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369208 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerDied","Data":"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f"} Mar 14 
09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369215 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369222 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369228 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369233 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369238 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369243 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369249 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369256 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f"} Mar 14 
09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369261 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369267 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369274 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerDied","Data":"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369281 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369287 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369292 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369297 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369302 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369307 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369312 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369317 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369321 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369326 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369332 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" event={"ID":"57d4b4cb-2115-421e-8f2a-491ec851328c","Type":"ContainerDied","Data":"ac6d0c7d195480bd7fed164793aa36cad36f6abe33425c6eb81abc79f7c91832"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369346 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369352 4956 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369358 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369363 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369370 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369376 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369383 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369390 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369410 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369417 4956 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b"} Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369432 4956 scope.go:117] "RemoveContainer" containerID="641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.369625 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qj4pg" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.410747 4956 scope.go:117] "RemoveContainer" containerID="ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.425691 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qj4pg"] Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.433954 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qj4pg"] Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.443902 4956 scope.go:117] "RemoveContainer" containerID="02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.472953 4956 scope.go:117] "RemoveContainer" containerID="721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.488003 4956 scope.go:117] "RemoveContainer" containerID="179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.501634 4956 scope.go:117] "RemoveContainer" containerID="505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.526384 4956 scope.go:117] "RemoveContainer" containerID="6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4" Mar 14 09:09:10 crc 
kubenswrapper[4956]: I0314 09:09:10.599959 4956 scope.go:117] "RemoveContainer" containerID="84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.616217 4956 scope.go:117] "RemoveContainer" containerID="ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.637977 4956 scope.go:117] "RemoveContainer" containerID="37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.663675 4956 scope.go:117] "RemoveContainer" containerID="641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.665368 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77\": container with ID starting with 641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77 not found: ID does not exist" containerID="641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.665451 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77"} err="failed to get container status \"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77\": rpc error: code = NotFound desc = could not find container \"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77\": container with ID starting with 641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77 not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.665516 4956 scope.go:117] "RemoveContainer" containerID="ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.665924 
4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3\": container with ID starting with ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3 not found: ID does not exist" containerID="ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.665967 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3"} err="failed to get container status \"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3\": rpc error: code = NotFound desc = could not find container \"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3\": container with ID starting with ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3 not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.665996 4956 scope.go:117] "RemoveContainer" containerID="02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.666422 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\": container with ID starting with 02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa not found: ID does not exist" containerID="02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.666517 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa"} err="failed to get container status \"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\": rpc error: code = 
NotFound desc = could not find container \"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\": container with ID starting with 02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.666567 4956 scope.go:117] "RemoveContainer" containerID="721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.669475 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\": container with ID starting with 721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f not found: ID does not exist" containerID="721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.669530 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f"} err="failed to get container status \"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\": rpc error: code = NotFound desc = could not find container \"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\": container with ID starting with 721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.669551 4956 scope.go:117] "RemoveContainer" containerID="179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.669884 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\": container with ID starting with 
179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2 not found: ID does not exist" containerID="179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.669902 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2"} err="failed to get container status \"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\": rpc error: code = NotFound desc = could not find container \"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\": container with ID starting with 179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2 not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.669917 4956 scope.go:117] "RemoveContainer" containerID="505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.670347 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\": container with ID starting with 505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c not found: ID does not exist" containerID="505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.670366 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c"} err="failed to get container status \"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\": rpc error: code = NotFound desc = could not find container \"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\": container with ID starting with 505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c not found: ID does not 
exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.670379 4956 scope.go:117] "RemoveContainer" containerID="6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.670776 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\": container with ID starting with 6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4 not found: ID does not exist" containerID="6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.670819 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4"} err="failed to get container status \"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\": rpc error: code = NotFound desc = could not find container \"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\": container with ID starting with 6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4 not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.670853 4956 scope.go:117] "RemoveContainer" containerID="84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.671489 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\": container with ID starting with 84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f not found: ID does not exist" containerID="84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.671514 4956 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f"} err="failed to get container status \"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\": rpc error: code = NotFound desc = could not find container \"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\": container with ID starting with 84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.671531 4956 scope.go:117] "RemoveContainer" containerID="ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.671849 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\": container with ID starting with ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b not found: ID does not exist" containerID="ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.671868 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b"} err="failed to get container status \"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\": rpc error: code = NotFound desc = could not find container \"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\": container with ID starting with ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.671881 4956 scope.go:117] "RemoveContainer" containerID="37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b" Mar 14 09:09:10 crc kubenswrapper[4956]: E0314 09:09:10.672267 4956 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\": container with ID starting with 37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b not found: ID does not exist" containerID="37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.672282 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b"} err="failed to get container status \"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\": rpc error: code = NotFound desc = could not find container \"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\": container with ID starting with 37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.672297 4956 scope.go:117] "RemoveContainer" containerID="641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.672576 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77"} err="failed to get container status \"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77\": rpc error: code = NotFound desc = could not find container \"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77\": container with ID starting with 641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77 not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.672592 4956 scope.go:117] "RemoveContainer" containerID="ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.672926 4956 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3"} err="failed to get container status \"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3\": rpc error: code = NotFound desc = could not find container \"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3\": container with ID starting with ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3 not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.672941 4956 scope.go:117] "RemoveContainer" containerID="02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.673297 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa"} err="failed to get container status \"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\": rpc error: code = NotFound desc = could not find container \"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\": container with ID starting with 02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.673311 4956 scope.go:117] "RemoveContainer" containerID="721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.673551 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f"} err="failed to get container status \"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\": rpc error: code = NotFound desc = could not find container \"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\": container with ID starting with 
721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.673574 4956 scope.go:117] "RemoveContainer" containerID="179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.673945 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2"} err="failed to get container status \"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\": rpc error: code = NotFound desc = could not find container \"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\": container with ID starting with 179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2 not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.673966 4956 scope.go:117] "RemoveContainer" containerID="505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.674380 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c"} err="failed to get container status \"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\": rpc error: code = NotFound desc = could not find container \"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\": container with ID starting with 505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.674508 4956 scope.go:117] "RemoveContainer" containerID="6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.675012 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4"} err="failed to get container status \"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\": rpc error: code = NotFound desc = could not find container \"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\": container with ID starting with 6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4 not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.675037 4956 scope.go:117] "RemoveContainer" containerID="84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.675424 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f"} err="failed to get container status \"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\": rpc error: code = NotFound desc = could not find container \"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\": container with ID starting with 84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.675530 4956 scope.go:117] "RemoveContainer" containerID="ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.676141 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b"} err="failed to get container status \"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\": rpc error: code = NotFound desc = could not find container \"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\": container with ID starting with ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b not found: ID does not 
exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.676165 4956 scope.go:117] "RemoveContainer" containerID="37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.676458 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b"} err="failed to get container status \"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\": rpc error: code = NotFound desc = could not find container \"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\": container with ID starting with 37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.676552 4956 scope.go:117] "RemoveContainer" containerID="641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.676932 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77"} err="failed to get container status \"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77\": rpc error: code = NotFound desc = could not find container \"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77\": container with ID starting with 641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77 not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.676991 4956 scope.go:117] "RemoveContainer" containerID="ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.677271 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3"} err="failed to get container status 
\"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3\": rpc error: code = NotFound desc = could not find container \"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3\": container with ID starting with ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3 not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.677286 4956 scope.go:117] "RemoveContainer" containerID="02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.677493 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa"} err="failed to get container status \"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\": rpc error: code = NotFound desc = could not find container \"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\": container with ID starting with 02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.677505 4956 scope.go:117] "RemoveContainer" containerID="721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.677846 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f"} err="failed to get container status \"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\": rpc error: code = NotFound desc = could not find container \"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\": container with ID starting with 721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.677874 4956 scope.go:117] "RemoveContainer" 
containerID="179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.678103 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2"} err="failed to get container status \"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\": rpc error: code = NotFound desc = could not find container \"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\": container with ID starting with 179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2 not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.678127 4956 scope.go:117] "RemoveContainer" containerID="505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.678338 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c"} err="failed to get container status \"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\": rpc error: code = NotFound desc = could not find container \"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\": container with ID starting with 505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.678468 4956 scope.go:117] "RemoveContainer" containerID="6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.678961 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4"} err="failed to get container status \"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\": rpc error: code = NotFound desc = could 
not find container \"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\": container with ID starting with 6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4 not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.678992 4956 scope.go:117] "RemoveContainer" containerID="84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.679200 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f"} err="failed to get container status \"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\": rpc error: code = NotFound desc = could not find container \"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\": container with ID starting with 84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.679220 4956 scope.go:117] "RemoveContainer" containerID="ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.679534 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b"} err="failed to get container status \"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\": rpc error: code = NotFound desc = could not find container \"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\": container with ID starting with ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.679553 4956 scope.go:117] "RemoveContainer" containerID="37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 
09:09:10.679937 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b"} err="failed to get container status \"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\": rpc error: code = NotFound desc = could not find container \"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\": container with ID starting with 37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.679956 4956 scope.go:117] "RemoveContainer" containerID="641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.680195 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77"} err="failed to get container status \"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77\": rpc error: code = NotFound desc = could not find container \"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77\": container with ID starting with 641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77 not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.680218 4956 scope.go:117] "RemoveContainer" containerID="ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.680464 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3"} err="failed to get container status \"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3\": rpc error: code = NotFound desc = could not find container \"ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3\": container with ID starting with 
ef98b27380cd73ef0aba67a47b02a30ba31202b4c2e6c0224bf3d8b0d7abecf3 not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.680492 4956 scope.go:117] "RemoveContainer" containerID="02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.680804 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa"} err="failed to get container status \"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\": rpc error: code = NotFound desc = could not find container \"02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa\": container with ID starting with 02f19415a77366bc81c4461046135ecd91da660f64680706fb38bbcde2349faa not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.680842 4956 scope.go:117] "RemoveContainer" containerID="721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.681154 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f"} err="failed to get container status \"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\": rpc error: code = NotFound desc = could not find container \"721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f\": container with ID starting with 721a509b5f87c49ee407ba5d907592a51df3ea91b8775f494f27071bbf9ca02f not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.681193 4956 scope.go:117] "RemoveContainer" containerID="179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.681460 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2"} err="failed to get container status \"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\": rpc error: code = NotFound desc = could not find container \"179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2\": container with ID starting with 179fe3c5696da9dd8fd68cb27d2423fc8e037e5e03503f6f582135ba089cbae2 not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.681495 4956 scope.go:117] "RemoveContainer" containerID="505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.681872 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c"} err="failed to get container status \"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\": rpc error: code = NotFound desc = could not find container \"505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c\": container with ID starting with 505f0a9211c35e69a4f3ce79cc7ab13ce1265557ae14155d7d9ff1eb14b6a93c not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.681890 4956 scope.go:117] "RemoveContainer" containerID="6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.682227 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4"} err="failed to get container status \"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\": rpc error: code = NotFound desc = could not find container \"6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4\": container with ID starting with 6287ed93c28c20c7e5a602c12ce31280ecff68722e17bbd2a204ee46a0440ec4 not found: ID does not 
exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.682248 4956 scope.go:117] "RemoveContainer" containerID="84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.682571 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f"} err="failed to get container status \"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\": rpc error: code = NotFound desc = could not find container \"84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f\": container with ID starting with 84d995fdf62438fcc0e1a7572603fb93131a238a8add8ffbdf8b5d12d094666f not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.682587 4956 scope.go:117] "RemoveContainer" containerID="ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.682807 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b"} err="failed to get container status \"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\": rpc error: code = NotFound desc = could not find container \"ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b\": container with ID starting with ade8d6b229a00c7f3cf2d87fda815c93c8a20661c947aadb4bd8be3e97f2cf5b not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.682821 4956 scope.go:117] "RemoveContainer" containerID="37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.683133 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b"} err="failed to get container status 
\"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\": rpc error: code = NotFound desc = could not find container \"37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b\": container with ID starting with 37aaa3b1c1f9b885a406fb6ea724eed82d04441028c19bd1b048f14ba08e663b not found: ID does not exist" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.683150 4956 scope.go:117] "RemoveContainer" containerID="641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77" Mar 14 09:09:10 crc kubenswrapper[4956]: I0314 09:09:10.683399 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77"} err="failed to get container status \"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77\": rpc error: code = NotFound desc = could not find container \"641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77\": container with ID starting with 641e37b336c8407c9b419a5f13304912916308284863f64820da612f2a6d3a77 not found: ID does not exist" Mar 14 09:09:11 crc kubenswrapper[4956]: I0314 09:09:11.216259 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57d4b4cb-2115-421e-8f2a-491ec851328c" path="/var/lib/kubelet/pods/57d4b4cb-2115-421e-8f2a-491ec851328c/volumes" Mar 14 09:09:11 crc kubenswrapper[4956]: I0314 09:09:11.376964 4956 generic.go:334] "Generic (PLEG): container finished" podID="7ada84a5-3502-4dba-a59a-3742895ecf23" containerID="37811ea3e1e26fff303d1d56df590322d1e12c2de6a31c55bc3bf66cddccb303" exitCode=0 Mar 14 09:09:11 crc kubenswrapper[4956]: I0314 09:09:11.377062 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" event={"ID":"7ada84a5-3502-4dba-a59a-3742895ecf23","Type":"ContainerDied","Data":"37811ea3e1e26fff303d1d56df590322d1e12c2de6a31c55bc3bf66cddccb303"} Mar 14 09:09:11 crc kubenswrapper[4956]: I0314 09:09:11.377408 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" event={"ID":"7ada84a5-3502-4dba-a59a-3742895ecf23","Type":"ContainerStarted","Data":"62ee6565a526322c84398c959d52401e945c5fe4ce116d0d9994d797f304aa19"} Mar 14 09:09:12 crc kubenswrapper[4956]: I0314 09:09:12.401515 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" event={"ID":"7ada84a5-3502-4dba-a59a-3742895ecf23","Type":"ContainerStarted","Data":"ce6ccb982fc46f31282dcee611429dce6d1fa76ee33faf054af4d2afb3de9652"} Mar 14 09:09:12 crc kubenswrapper[4956]: I0314 09:09:12.401817 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" event={"ID":"7ada84a5-3502-4dba-a59a-3742895ecf23","Type":"ContainerStarted","Data":"df16b5c077d2f239da750e043da673b02fee688d5c451432dcbc6d468d71958a"} Mar 14 09:09:12 crc kubenswrapper[4956]: I0314 09:09:12.401826 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" event={"ID":"7ada84a5-3502-4dba-a59a-3742895ecf23","Type":"ContainerStarted","Data":"35a256277851502b9f702f1f75cd76b912e9ce7405768639b928585244c37ebb"} Mar 14 09:09:12 crc kubenswrapper[4956]: I0314 09:09:12.401837 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" event={"ID":"7ada84a5-3502-4dba-a59a-3742895ecf23","Type":"ContainerStarted","Data":"6faebe507e2e738553bbabcbb4b7640b1280ae52786338dcb2b0c8f41fbd196c"} Mar 14 09:09:12 crc kubenswrapper[4956]: I0314 09:09:12.401846 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" event={"ID":"7ada84a5-3502-4dba-a59a-3742895ecf23","Type":"ContainerStarted","Data":"82864372c41dcfd1c7eec50b2f2af6361d746336b4be85e2307419fa3136fc21"} Mar 14 09:09:12 crc kubenswrapper[4956]: I0314 09:09:12.401853 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" event={"ID":"7ada84a5-3502-4dba-a59a-3742895ecf23","Type":"ContainerStarted","Data":"1245c839c83207f735c8c688d882d933cafca3aec296c317ab98ec4ab9d6ff4c"} Mar 14 09:09:15 crc kubenswrapper[4956]: I0314 09:09:15.419607 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" event={"ID":"7ada84a5-3502-4dba-a59a-3742895ecf23","Type":"ContainerStarted","Data":"dfa2ffc71ad601b0202f5bbd5f3ea600785670fcb67897475b2a96303d4ce8e8"} Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.575623 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-259ww"] Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.576453 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.578638 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-fzftk" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.578880 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.581441 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.649982 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z"] Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.651280 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.658083 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw"] Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.659047 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-h4vx6" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.659060 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.659257 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.674316 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z\" (UID: \"b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.674366 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b4vk\" (UniqueName: \"kubernetes.io/projected/c9a78fe1-8b1a-4655-ad98-19c53622c2b1-kube-api-access-2b4vk\") pod \"obo-prometheus-operator-68bc856cb9-259ww\" (UID: \"c9a78fe1-8b1a-4655-ad98-19c53622c2b1\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.674400 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z\" (UID: \"b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.776350 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z\" (UID: \"b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.776991 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c888df81-af56-4ddd-8857-4952f199f288-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw\" (UID: \"c888df81-af56-4ddd-8857-4952f199f288\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.777023 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c888df81-af56-4ddd-8857-4952f199f288-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw\" (UID: \"c888df81-af56-4ddd-8857-4952f199f288\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.777062 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b4vk\" (UniqueName: 
\"kubernetes.io/projected/c9a78fe1-8b1a-4655-ad98-19c53622c2b1-kube-api-access-2b4vk\") pod \"obo-prometheus-operator-68bc856cb9-259ww\" (UID: \"c9a78fe1-8b1a-4655-ad98-19c53622c2b1\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.777148 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z\" (UID: \"b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.782734 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z\" (UID: \"b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.789947 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z\" (UID: \"b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.804581 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b4vk\" (UniqueName: \"kubernetes.io/projected/c9a78fe1-8b1a-4655-ad98-19c53622c2b1-kube-api-access-2b4vk\") pod \"obo-prometheus-operator-68bc856cb9-259ww\" (UID: \"c9a78fe1-8b1a-4655-ad98-19c53622c2b1\") " 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.824531 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-q6fzl"] Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.825324 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.827184 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.827542 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-bscrb" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.878568 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c888df81-af56-4ddd-8857-4952f199f288-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw\" (UID: \"c888df81-af56-4ddd-8857-4952f199f288\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.878612 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c888df81-af56-4ddd-8857-4952f199f288-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw\" (UID: \"c888df81-af56-4ddd-8857-4952f199f288\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.878655 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm55f\" (UniqueName: 
\"kubernetes.io/projected/2dd43b3c-f9be-41a8-b1a6-11ab052283c7-kube-api-access-hm55f\") pod \"observability-operator-59bdc8b94-q6fzl\" (UID: \"2dd43b3c-f9be-41a8-b1a6-11ab052283c7\") " pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.878688 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2dd43b3c-f9be-41a8-b1a6-11ab052283c7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-q6fzl\" (UID: \"2dd43b3c-f9be-41a8-b1a6-11ab052283c7\") " pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.882908 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c888df81-af56-4ddd-8857-4952f199f288-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw\" (UID: \"c888df81-af56-4ddd-8857-4952f199f288\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.883020 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c888df81-af56-4ddd-8857-4952f199f288-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw\" (UID: \"c888df81-af56-4ddd-8857-4952f199f288\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.892948 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" Mar 14 09:09:16 crc kubenswrapper[4956]: E0314 09:09:16.913927 4956 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-259ww_openshift-operators_c9a78fe1-8b1a-4655-ad98-19c53622c2b1_0(08bf9b26d02ac2717c51bb9e7a6e9b96f99e79152a6743a7b06d65614d967ca8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 09:09:16 crc kubenswrapper[4956]: E0314 09:09:16.914006 4956 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-259ww_openshift-operators_c9a78fe1-8b1a-4655-ad98-19c53622c2b1_0(08bf9b26d02ac2717c51bb9e7a6e9b96f99e79152a6743a7b06d65614d967ca8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" Mar 14 09:09:16 crc kubenswrapper[4956]: E0314 09:09:16.914029 4956 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-259ww_openshift-operators_c9a78fe1-8b1a-4655-ad98-19c53622c2b1_0(08bf9b26d02ac2717c51bb9e7a6e9b96f99e79152a6743a7b06d65614d967ca8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" Mar 14 09:09:16 crc kubenswrapper[4956]: E0314 09:09:16.914087 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-259ww_openshift-operators(c9a78fe1-8b1a-4655-ad98-19c53622c2b1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-259ww_openshift-operators(c9a78fe1-8b1a-4655-ad98-19c53622c2b1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-259ww_openshift-operators_c9a78fe1-8b1a-4655-ad98-19c53622c2b1_0(08bf9b26d02ac2717c51bb9e7a6e9b96f99e79152a6743a7b06d65614d967ca8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" podUID="c9a78fe1-8b1a-4655-ad98-19c53622c2b1" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.970863 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.977899 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.979580 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm55f\" (UniqueName: \"kubernetes.io/projected/2dd43b3c-f9be-41a8-b1a6-11ab052283c7-kube-api-access-hm55f\") pod \"observability-operator-59bdc8b94-q6fzl\" (UID: \"2dd43b3c-f9be-41a8-b1a6-11ab052283c7\") " pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.979645 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2dd43b3c-f9be-41a8-b1a6-11ab052283c7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-q6fzl\" (UID: \"2dd43b3c-f9be-41a8-b1a6-11ab052283c7\") " pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:16 crc kubenswrapper[4956]: I0314 09:09:16.987264 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2dd43b3c-f9be-41a8-b1a6-11ab052283c7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-q6fzl\" (UID: \"2dd43b3c-f9be-41a8-b1a6-11ab052283c7\") " pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.011277 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm55f\" (UniqueName: \"kubernetes.io/projected/2dd43b3c-f9be-41a8-b1a6-11ab052283c7-kube-api-access-hm55f\") pod \"observability-operator-59bdc8b94-q6fzl\" (UID: \"2dd43b3c-f9be-41a8-b1a6-11ab052283c7\") " pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.012147 4956 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/perses-operator-5bf474d74f-2vqmj"] Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.013110 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.020165 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-ll4nx" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.027071 4956 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z_openshift-operators_b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193_0(c11c9bc7fd14d758b16f4a0bbc87464cb47bff31961c62d67bf1ed72a23def8d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.027156 4956 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z_openshift-operators_b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193_0(c11c9bc7fd14d758b16f4a0bbc87464cb47bff31961c62d67bf1ed72a23def8d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.027228 4956 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z_openshift-operators_b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193_0(c11c9bc7fd14d758b16f4a0bbc87464cb47bff31961c62d67bf1ed72a23def8d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.027309 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z_openshift-operators(b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z_openshift-operators(b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z_openshift-operators_b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193_0(c11c9bc7fd14d758b16f4a0bbc87464cb47bff31961c62d67bf1ed72a23def8d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" podUID="b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.052651 4956 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw_openshift-operators_c888df81-af56-4ddd-8857-4952f199f288_0(33617867f2cea38da78fbadd47554c715d323f52d17cda1a028727c919514a79): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.052702 4956 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw_openshift-operators_c888df81-af56-4ddd-8857-4952f199f288_0(33617867f2cea38da78fbadd47554c715d323f52d17cda1a028727c919514a79): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.052727 4956 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw_openshift-operators_c888df81-af56-4ddd-8857-4952f199f288_0(33617867f2cea38da78fbadd47554c715d323f52d17cda1a028727c919514a79): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.052781 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw_openshift-operators(c888df81-af56-4ddd-8857-4952f199f288)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw_openshift-operators(c888df81-af56-4ddd-8857-4952f199f288)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw_openshift-operators_c888df81-af56-4ddd-8857-4952f199f288_0(33617867f2cea38da78fbadd47554c715d323f52d17cda1a028727c919514a79): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" podUID="c888df81-af56-4ddd-8857-4952f199f288" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.080516 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f5tf\" (UniqueName: \"kubernetes.io/projected/50a4be30-e002-4c15-b3ef-b3048665261b-kube-api-access-5f5tf\") pod \"perses-operator-5bf474d74f-2vqmj\" (UID: \"50a4be30-e002-4c15-b3ef-b3048665261b\") " pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.080614 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/50a4be30-e002-4c15-b3ef-b3048665261b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-2vqmj\" (UID: \"50a4be30-e002-4c15-b3ef-b3048665261b\") " pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.145089 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.167435 4956 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-q6fzl_openshift-operators_2dd43b3c-f9be-41a8-b1a6-11ab052283c7_0(9cee4db3340bd2d1877dfd5c74be5946985c122fa91f23102375433ed4f0fc9f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.167530 4956 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-q6fzl_openshift-operators_2dd43b3c-f9be-41a8-b1a6-11ab052283c7_0(9cee4db3340bd2d1877dfd5c74be5946985c122fa91f23102375433ed4f0fc9f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.167561 4956 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-q6fzl_openshift-operators_2dd43b3c-f9be-41a8-b1a6-11ab052283c7_0(9cee4db3340bd2d1877dfd5c74be5946985c122fa91f23102375433ed4f0fc9f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.167623 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-q6fzl_openshift-operators(2dd43b3c-f9be-41a8-b1a6-11ab052283c7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-q6fzl_openshift-operators(2dd43b3c-f9be-41a8-b1a6-11ab052283c7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-q6fzl_openshift-operators_2dd43b3c-f9be-41a8-b1a6-11ab052283c7_0(9cee4db3340bd2d1877dfd5c74be5946985c122fa91f23102375433ed4f0fc9f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" podUID="2dd43b3c-f9be-41a8-b1a6-11ab052283c7" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.182247 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/50a4be30-e002-4c15-b3ef-b3048665261b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-2vqmj\" (UID: \"50a4be30-e002-4c15-b3ef-b3048665261b\") " pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.182353 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f5tf\" (UniqueName: \"kubernetes.io/projected/50a4be30-e002-4c15-b3ef-b3048665261b-kube-api-access-5f5tf\") pod \"perses-operator-5bf474d74f-2vqmj\" (UID: \"50a4be30-e002-4c15-b3ef-b3048665261b\") " pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.183518 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/50a4be30-e002-4c15-b3ef-b3048665261b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-2vqmj\" (UID: \"50a4be30-e002-4c15-b3ef-b3048665261b\") " pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.202137 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f5tf\" (UniqueName: \"kubernetes.io/projected/50a4be30-e002-4c15-b3ef-b3048665261b-kube-api-access-5f5tf\") pod \"perses-operator-5bf474d74f-2vqmj\" (UID: \"50a4be30-e002-4c15-b3ef-b3048665261b\") " pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.338569 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.362514 4956 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2vqmj_openshift-operators_50a4be30-e002-4c15-b3ef-b3048665261b_0(7836bd5fae9a3572e1828739a571da13fd8fa021ab2e5dce99ad9932918c5cf4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.362589 4956 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2vqmj_openshift-operators_50a4be30-e002-4c15-b3ef-b3048665261b_0(7836bd5fae9a3572e1828739a571da13fd8fa021ab2e5dce99ad9932918c5cf4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.362612 4956 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2vqmj_openshift-operators_50a4be30-e002-4c15-b3ef-b3048665261b_0(7836bd5fae9a3572e1828739a571da13fd8fa021ab2e5dce99ad9932918c5cf4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.362658 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-2vqmj_openshift-operators(50a4be30-e002-4c15-b3ef-b3048665261b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-2vqmj_openshift-operators(50a4be30-e002-4c15-b3ef-b3048665261b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2vqmj_openshift-operators_50a4be30-e002-4c15-b3ef-b3048665261b_0(7836bd5fae9a3572e1828739a571da13fd8fa021ab2e5dce99ad9932918c5cf4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" podUID="50a4be30-e002-4c15-b3ef-b3048665261b" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.433218 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" event={"ID":"7ada84a5-3502-4dba-a59a-3742895ecf23","Type":"ContainerStarted","Data":"20783c7ca60618ecb21ced8b38ed66c9b3bd20d0e557897eb9660d7238b0eef7"} Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.433494 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.433509 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.464682 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.471581 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" podStartSLOduration=7.47156521 
podStartE2EDuration="7.47156521s" podCreationTimestamp="2026-03-14 09:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:09:17.469807456 +0000 UTC m=+762.982499724" watchObservedRunningTime="2026-03-14 09:09:17.47156521 +0000 UTC m=+762.984257478" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.883823 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z"] Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.883938 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.884439 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.886742 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw"] Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.887098 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.887533 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.918075 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-q6fzl"] Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.918184 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.923123 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.954713 4956 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z_openshift-operators_b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193_0(7e85a68e014cb4746f612215ba554e24649d7c9f5c7dc8eeb707a0d5415d3fde): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.954795 4956 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z_openshift-operators_b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193_0(7e85a68e014cb4746f612215ba554e24649d7c9f5c7dc8eeb707a0d5415d3fde): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.954826 4956 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z_openshift-operators_b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193_0(7e85a68e014cb4746f612215ba554e24649d7c9f5c7dc8eeb707a0d5415d3fde): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.954736 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-2vqmj"] Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.954881 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z_openshift-operators(b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z_openshift-operators(b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z_openshift-operators_b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193_0(7e85a68e014cb4746f612215ba554e24649d7c9f5c7dc8eeb707a0d5415d3fde): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" podUID="b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.954956 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.955351 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.958327 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-259ww"] Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.958418 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" Mar 14 09:09:17 crc kubenswrapper[4956]: I0314 09:09:17.958679 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.963427 4956 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-q6fzl_openshift-operators_2dd43b3c-f9be-41a8-b1a6-11ab052283c7_0(cb8486db9d33889361b146a11d1b32e474a3b6893e44dde75dc347dc35d133da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.963516 4956 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-q6fzl_openshift-operators_2dd43b3c-f9be-41a8-b1a6-11ab052283c7_0(cb8486db9d33889361b146a11d1b32e474a3b6893e44dde75dc347dc35d133da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.963546 4956 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-q6fzl_openshift-operators_2dd43b3c-f9be-41a8-b1a6-11ab052283c7_0(cb8486db9d33889361b146a11d1b32e474a3b6893e44dde75dc347dc35d133da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.963605 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-q6fzl_openshift-operators(2dd43b3c-f9be-41a8-b1a6-11ab052283c7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-q6fzl_openshift-operators(2dd43b3c-f9be-41a8-b1a6-11ab052283c7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-q6fzl_openshift-operators_2dd43b3c-f9be-41a8-b1a6-11ab052283c7_0(cb8486db9d33889361b146a11d1b32e474a3b6893e44dde75dc347dc35d133da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" podUID="2dd43b3c-f9be-41a8-b1a6-11ab052283c7" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.970662 4956 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw_openshift-operators_c888df81-af56-4ddd-8857-4952f199f288_0(4c779f98ec2762c49664facad3c029385c20693d1cc1457b2dbf0f7c2d887c46): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.970734 4956 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw_openshift-operators_c888df81-af56-4ddd-8857-4952f199f288_0(4c779f98ec2762c49664facad3c029385c20693d1cc1457b2dbf0f7c2d887c46): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.970759 4956 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw_openshift-operators_c888df81-af56-4ddd-8857-4952f199f288_0(4c779f98ec2762c49664facad3c029385c20693d1cc1457b2dbf0f7c2d887c46): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" Mar 14 09:09:17 crc kubenswrapper[4956]: E0314 09:09:17.970801 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw_openshift-operators(c888df81-af56-4ddd-8857-4952f199f288)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw_openshift-operators(c888df81-af56-4ddd-8857-4952f199f288)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw_openshift-operators_c888df81-af56-4ddd-8857-4952f199f288_0(4c779f98ec2762c49664facad3c029385c20693d1cc1457b2dbf0f7c2d887c46): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" podUID="c888df81-af56-4ddd-8857-4952f199f288" Mar 14 09:09:18 crc kubenswrapper[4956]: E0314 09:09:18.000601 4956 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-259ww_openshift-operators_c9a78fe1-8b1a-4655-ad98-19c53622c2b1_0(7ed12e268807929004f410bc551376c5840a29ff2d1e6d6bcc9915f7a7b2f3a9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 09:09:18 crc kubenswrapper[4956]: E0314 09:09:18.000671 4956 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-259ww_openshift-operators_c9a78fe1-8b1a-4655-ad98-19c53622c2b1_0(7ed12e268807929004f410bc551376c5840a29ff2d1e6d6bcc9915f7a7b2f3a9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" Mar 14 09:09:18 crc kubenswrapper[4956]: E0314 09:09:18.000696 4956 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-259ww_openshift-operators_c9a78fe1-8b1a-4655-ad98-19c53622c2b1_0(7ed12e268807929004f410bc551376c5840a29ff2d1e6d6bcc9915f7a7b2f3a9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" Mar 14 09:09:18 crc kubenswrapper[4956]: E0314 09:09:18.000741 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-259ww_openshift-operators(c9a78fe1-8b1a-4655-ad98-19c53622c2b1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-259ww_openshift-operators(c9a78fe1-8b1a-4655-ad98-19c53622c2b1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-259ww_openshift-operators_c9a78fe1-8b1a-4655-ad98-19c53622c2b1_0(7ed12e268807929004f410bc551376c5840a29ff2d1e6d6bcc9915f7a7b2f3a9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" podUID="c9a78fe1-8b1a-4655-ad98-19c53622c2b1" Mar 14 09:09:18 crc kubenswrapper[4956]: E0314 09:09:18.009611 4956 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2vqmj_openshift-operators_50a4be30-e002-4c15-b3ef-b3048665261b_0(e9fbb76b189ba2535459d28ddc83710c2a5d81a9f6d69f7510704cd2a96ae136): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 09:09:18 crc kubenswrapper[4956]: E0314 09:09:18.009684 4956 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2vqmj_openshift-operators_50a4be30-e002-4c15-b3ef-b3048665261b_0(e9fbb76b189ba2535459d28ddc83710c2a5d81a9f6d69f7510704cd2a96ae136): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:18 crc kubenswrapper[4956]: E0314 09:09:18.009706 4956 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2vqmj_openshift-operators_50a4be30-e002-4c15-b3ef-b3048665261b_0(e9fbb76b189ba2535459d28ddc83710c2a5d81a9f6d69f7510704cd2a96ae136): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:18 crc kubenswrapper[4956]: E0314 09:09:18.009751 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-2vqmj_openshift-operators(50a4be30-e002-4c15-b3ef-b3048665261b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-2vqmj_openshift-operators(50a4be30-e002-4c15-b3ef-b3048665261b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2vqmj_openshift-operators_50a4be30-e002-4c15-b3ef-b3048665261b_0(e9fbb76b189ba2535459d28ddc83710c2a5d81a9f6d69f7510704cd2a96ae136): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" podUID="50a4be30-e002-4c15-b3ef-b3048665261b" Mar 14 09:09:18 crc kubenswrapper[4956]: I0314 09:09:18.438366 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:18 crc kubenswrapper[4956]: I0314 09:09:18.467243 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:24 crc kubenswrapper[4956]: I0314 09:09:24.208847 4956 scope.go:117] "RemoveContainer" containerID="e11575d346470f0c65bf883c0676009985f639d05b04ccb994919585ff0ae99a" Mar 14 09:09:24 crc kubenswrapper[4956]: I0314 09:09:24.476984 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgnxb_7528e098-09d4-436f-a32d-a0e82e76b8e0/kube-multus/1.log" Mar 14 09:09:24 crc kubenswrapper[4956]: I0314 09:09:24.477257 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgnxb" event={"ID":"7528e098-09d4-436f-a32d-a0e82e76b8e0","Type":"ContainerStarted","Data":"f311ee7b203514b8ef4ca3d5460f71c77318db4a203e41d325dddcbd4e6546a1"} Mar 14 09:09:25 crc kubenswrapper[4956]: I0314 09:09:25.424072 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:09:25 crc kubenswrapper[4956]: I0314 09:09:25.424139 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:09:29 crc kubenswrapper[4956]: 
I0314 09:09:29.208445 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" Mar 14 09:09:29 crc kubenswrapper[4956]: I0314 09:09:29.209225 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" Mar 14 09:09:29 crc kubenswrapper[4956]: I0314 09:09:29.413956 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z"] Mar 14 09:09:29 crc kubenswrapper[4956]: W0314 09:09:29.419941 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9e5b9f8_a1eb_4cd9_8200_dbf2cbdb4193.slice/crio-2cef3f765faf16b7b8f11f373c1db1ee52198cfc1a28f82aaa70ebbc1ed9fc0d WatchSource:0}: Error finding container 2cef3f765faf16b7b8f11f373c1db1ee52198cfc1a28f82aaa70ebbc1ed9fc0d: Status 404 returned error can't find the container with id 2cef3f765faf16b7b8f11f373c1db1ee52198cfc1a28f82aaa70ebbc1ed9fc0d Mar 14 09:09:29 crc kubenswrapper[4956]: I0314 09:09:29.503278 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" event={"ID":"b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193","Type":"ContainerStarted","Data":"2cef3f765faf16b7b8f11f373c1db1ee52198cfc1a28f82aaa70ebbc1ed9fc0d"} Mar 14 09:09:30 crc kubenswrapper[4956]: I0314 09:09:30.208713 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:30 crc kubenswrapper[4956]: I0314 09:09:30.209172 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:30 crc kubenswrapper[4956]: I0314 09:09:30.402129 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-2vqmj"] Mar 14 09:09:30 crc kubenswrapper[4956]: I0314 09:09:30.509627 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" event={"ID":"50a4be30-e002-4c15-b3ef-b3048665261b","Type":"ContainerStarted","Data":"f57b9abc70a40f2a77809243699c53dbd952469986ff81bb6fbec064b82d46c9"} Mar 14 09:09:31 crc kubenswrapper[4956]: I0314 09:09:31.208565 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:31 crc kubenswrapper[4956]: I0314 09:09:31.208630 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" Mar 14 09:09:31 crc kubenswrapper[4956]: I0314 09:09:31.209050 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:31 crc kubenswrapper[4956]: I0314 09:09:31.209244 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" Mar 14 09:09:31 crc kubenswrapper[4956]: I0314 09:09:31.437763 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-q6fzl"] Mar 14 09:09:31 crc kubenswrapper[4956]: I0314 09:09:31.509107 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-259ww"] Mar 14 09:09:31 crc kubenswrapper[4956]: W0314 09:09:31.521035 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9a78fe1_8b1a_4655_ad98_19c53622c2b1.slice/crio-139ffcbec69974c541ebeb904df5bd1be4e58aee79f8c3a66b325ad7fece18cf WatchSource:0}: Error finding container 139ffcbec69974c541ebeb904df5bd1be4e58aee79f8c3a66b325ad7fece18cf: Status 404 returned error can't find the container with id 139ffcbec69974c541ebeb904df5bd1be4e58aee79f8c3a66b325ad7fece18cf Mar 14 09:09:31 crc kubenswrapper[4956]: I0314 09:09:31.528674 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" event={"ID":"2dd43b3c-f9be-41a8-b1a6-11ab052283c7","Type":"ContainerStarted","Data":"2dbdd34f2ccee568d51282e528d00846487dc2d770a2df134af051e684dd4609"} Mar 14 09:09:32 crc kubenswrapper[4956]: I0314 09:09:32.208465 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" Mar 14 09:09:32 crc kubenswrapper[4956]: I0314 09:09:32.209254 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" Mar 14 09:09:32 crc kubenswrapper[4956]: I0314 09:09:32.489762 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw"] Mar 14 09:09:32 crc kubenswrapper[4956]: I0314 09:09:32.545893 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" event={"ID":"c888df81-af56-4ddd-8857-4952f199f288","Type":"ContainerStarted","Data":"3c4c9f7783190e99945bfd08a3827325462500764e3d84cf293cd2c7a06eb1c9"} Mar 14 09:09:32 crc kubenswrapper[4956]: I0314 09:09:32.553710 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" event={"ID":"c9a78fe1-8b1a-4655-ad98-19c53622c2b1","Type":"ContainerStarted","Data":"139ffcbec69974c541ebeb904df5bd1be4e58aee79f8c3a66b325ad7fece18cf"} Mar 14 09:09:39 crc kubenswrapper[4956]: I0314 09:09:39.238468 4956 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 14 09:09:40 crc kubenswrapper[4956]: I0314 09:09:40.402065 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qs67b" Mar 14 09:09:47 crc kubenswrapper[4956]: I0314 09:09:47.228638 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" event={"ID":"50a4be30-e002-4c15-b3ef-b3048665261b","Type":"ContainerStarted","Data":"188ae4ae6162903952a7506f236910a1faffa3eeee6a84d710c6ffe672528184"} Mar 14 09:09:47 crc kubenswrapper[4956]: I0314 09:09:47.229229 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:47 crc kubenswrapper[4956]: I0314 09:09:47.229870 4956 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" event={"ID":"2dd43b3c-f9be-41a8-b1a6-11ab052283c7","Type":"ContainerStarted","Data":"ac1afad019b6e7e998a3b8ffb2585194b457e58d5dbfed455b592e7c8c52a8c5"} Mar 14 09:09:47 crc kubenswrapper[4956]: I0314 09:09:47.230526 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:47 crc kubenswrapper[4956]: I0314 09:09:47.231397 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" event={"ID":"c888df81-af56-4ddd-8857-4952f199f288","Type":"ContainerStarted","Data":"897b8503aba922aeea91f2b0c4c4ca5f52cccc79d530af88fe6391aa9fe551b3"} Mar 14 09:09:47 crc kubenswrapper[4956]: I0314 09:09:47.233582 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" event={"ID":"b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193","Type":"ContainerStarted","Data":"3ac9dbbedbc56b13a93812bc00af9bec3d1916398b42ed18b017607f2bbc3c33"} Mar 14 09:09:47 crc kubenswrapper[4956]: I0314 09:09:47.235141 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" event={"ID":"c9a78fe1-8b1a-4655-ad98-19c53622c2b1","Type":"ContainerStarted","Data":"c02ed8f1732b182dad1bc34cd80b537b42d50e4794b2a96fea52dd112172448e"} Mar 14 09:09:47 crc kubenswrapper[4956]: I0314 09:09:47.248078 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" podStartSLOduration=15.066121526 podStartE2EDuration="31.248065808s" podCreationTimestamp="2026-03-14 09:09:16 +0000 UTC" firstStartedPulling="2026-03-14 09:09:30.415070513 +0000 UTC m=+775.927762781" lastFinishedPulling="2026-03-14 09:09:46.597014795 +0000 UTC m=+792.109707063" 
observedRunningTime="2026-03-14 09:09:47.246531299 +0000 UTC m=+792.759223567" watchObservedRunningTime="2026-03-14 09:09:47.248065808 +0000 UTC m=+792.760758076" Mar 14 09:09:47 crc kubenswrapper[4956]: I0314 09:09:47.265527 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-259ww" podStartSLOduration=16.199358932 podStartE2EDuration="31.265513263s" podCreationTimestamp="2026-03-14 09:09:16 +0000 UTC" firstStartedPulling="2026-03-14 09:09:31.531353496 +0000 UTC m=+777.044045764" lastFinishedPulling="2026-03-14 09:09:46.597507827 +0000 UTC m=+792.110200095" observedRunningTime="2026-03-14 09:09:47.26060061 +0000 UTC m=+792.773292878" watchObservedRunningTime="2026-03-14 09:09:47.265513263 +0000 UTC m=+792.778205531" Mar 14 09:09:47 crc kubenswrapper[4956]: I0314 09:09:47.278010 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z" podStartSLOduration=14.092120635 podStartE2EDuration="31.277980474s" podCreationTimestamp="2026-03-14 09:09:16 +0000 UTC" firstStartedPulling="2026-03-14 09:09:29.422079338 +0000 UTC m=+774.934771606" lastFinishedPulling="2026-03-14 09:09:46.607939177 +0000 UTC m=+792.120631445" observedRunningTime="2026-03-14 09:09:47.277722407 +0000 UTC m=+792.790414675" watchObservedRunningTime="2026-03-14 09:09:47.277980474 +0000 UTC m=+792.790672742" Mar 14 09:09:47 crc kubenswrapper[4956]: I0314 09:09:47.282190 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" Mar 14 09:09:47 crc kubenswrapper[4956]: I0314 09:09:47.304336 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw" podStartSLOduration=17.25582734 podStartE2EDuration="31.304307611s" podCreationTimestamp="2026-03-14 
09:09:16 +0000 UTC" firstStartedPulling="2026-03-14 09:09:32.532633767 +0000 UTC m=+778.045326035" lastFinishedPulling="2026-03-14 09:09:46.581114038 +0000 UTC m=+792.093806306" observedRunningTime="2026-03-14 09:09:47.299428889 +0000 UTC m=+792.812121157" watchObservedRunningTime="2026-03-14 09:09:47.304307611 +0000 UTC m=+792.816999879" Mar 14 09:09:47 crc kubenswrapper[4956]: I0314 09:09:47.323535 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-q6fzl" podStartSLOduration=16.156565484 podStartE2EDuration="31.32351013s" podCreationTimestamp="2026-03-14 09:09:16 +0000 UTC" firstStartedPulling="2026-03-14 09:09:31.478851566 +0000 UTC m=+776.991543834" lastFinishedPulling="2026-03-14 09:09:46.645796212 +0000 UTC m=+792.158488480" observedRunningTime="2026-03-14 09:09:47.318020633 +0000 UTC m=+792.830712901" watchObservedRunningTime="2026-03-14 09:09:47.32351013 +0000 UTC m=+792.836202398" Mar 14 09:09:52 crc kubenswrapper[4956]: I0314 09:09:52.793980 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r"] Mar 14 09:09:52 crc kubenswrapper[4956]: I0314 09:09:52.795325 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" Mar 14 09:09:52 crc kubenswrapper[4956]: I0314 09:09:52.797017 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 09:09:52 crc kubenswrapper[4956]: I0314 09:09:52.808372 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r"] Mar 14 09:09:52 crc kubenswrapper[4956]: I0314 09:09:52.833709 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75426cf5-0280-405c-a216-58ba481acb46-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r\" (UID: \"75426cf5-0280-405c-a216-58ba481acb46\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" Mar 14 09:09:52 crc kubenswrapper[4956]: I0314 09:09:52.833826 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75426cf5-0280-405c-a216-58ba481acb46-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r\" (UID: \"75426cf5-0280-405c-a216-58ba481acb46\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" Mar 14 09:09:52 crc kubenswrapper[4956]: I0314 09:09:52.833854 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx9h6\" (UniqueName: \"kubernetes.io/projected/75426cf5-0280-405c-a216-58ba481acb46-kube-api-access-fx9h6\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r\" (UID: \"75426cf5-0280-405c-a216-58ba481acb46\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" Mar 14 09:09:52 crc kubenswrapper[4956]: 
I0314 09:09:52.934819 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75426cf5-0280-405c-a216-58ba481acb46-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r\" (UID: \"75426cf5-0280-405c-a216-58ba481acb46\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" Mar 14 09:09:52 crc kubenswrapper[4956]: I0314 09:09:52.934883 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75426cf5-0280-405c-a216-58ba481acb46-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r\" (UID: \"75426cf5-0280-405c-a216-58ba481acb46\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" Mar 14 09:09:52 crc kubenswrapper[4956]: I0314 09:09:52.934904 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx9h6\" (UniqueName: \"kubernetes.io/projected/75426cf5-0280-405c-a216-58ba481acb46-kube-api-access-fx9h6\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r\" (UID: \"75426cf5-0280-405c-a216-58ba481acb46\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" Mar 14 09:09:52 crc kubenswrapper[4956]: I0314 09:09:52.935345 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75426cf5-0280-405c-a216-58ba481acb46-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r\" (UID: \"75426cf5-0280-405c-a216-58ba481acb46\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" Mar 14 09:09:52 crc kubenswrapper[4956]: I0314 09:09:52.935410 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/75426cf5-0280-405c-a216-58ba481acb46-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r\" (UID: \"75426cf5-0280-405c-a216-58ba481acb46\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" Mar 14 09:09:52 crc kubenswrapper[4956]: I0314 09:09:52.951632 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx9h6\" (UniqueName: \"kubernetes.io/projected/75426cf5-0280-405c-a216-58ba481acb46-kube-api-access-fx9h6\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r\" (UID: \"75426cf5-0280-405c-a216-58ba481acb46\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" Mar 14 09:09:53 crc kubenswrapper[4956]: I0314 09:09:53.109470 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" Mar 14 09:09:53 crc kubenswrapper[4956]: I0314 09:09:53.536797 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r"] Mar 14 09:09:53 crc kubenswrapper[4956]: W0314 09:09:53.545327 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75426cf5_0280_405c_a216_58ba481acb46.slice/crio-8a7951efd69bb3310e087d902aaaac943bad7919e0027f99a38344d6dd5b5562 WatchSource:0}: Error finding container 8a7951efd69bb3310e087d902aaaac943bad7919e0027f99a38344d6dd5b5562: Status 404 returned error can't find the container with id 8a7951efd69bb3310e087d902aaaac943bad7919e0027f99a38344d6dd5b5562 Mar 14 09:09:54 crc kubenswrapper[4956]: I0314 09:09:54.274126 4956 generic.go:334] "Generic (PLEG): container finished" podID="75426cf5-0280-405c-a216-58ba481acb46" containerID="06dd6c59e95c447dcd283c391c75c51056403ccca62279728e3c942902c567ca" exitCode=0 
Mar 14 09:09:54 crc kubenswrapper[4956]: I0314 09:09:54.274201 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" event={"ID":"75426cf5-0280-405c-a216-58ba481acb46","Type":"ContainerDied","Data":"06dd6c59e95c447dcd283c391c75c51056403ccca62279728e3c942902c567ca"} Mar 14 09:09:54 crc kubenswrapper[4956]: I0314 09:09:54.274541 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" event={"ID":"75426cf5-0280-405c-a216-58ba481acb46","Type":"ContainerStarted","Data":"8a7951efd69bb3310e087d902aaaac943bad7919e0027f99a38344d6dd5b5562"} Mar 14 09:09:55 crc kubenswrapper[4956]: I0314 09:09:55.161092 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9cxnh"] Mar 14 09:09:55 crc kubenswrapper[4956]: I0314 09:09:55.163228 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9cxnh" Mar 14 09:09:55 crc kubenswrapper[4956]: I0314 09:09:55.183293 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9cxnh"] Mar 14 09:09:55 crc kubenswrapper[4956]: I0314 09:09:55.364460 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjgp9\" (UniqueName: \"kubernetes.io/projected/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-kube-api-access-rjgp9\") pod \"redhat-operators-9cxnh\" (UID: \"4c6b256c-b1b1-4c21-abed-51f7a51d39f3\") " pod="openshift-marketplace/redhat-operators-9cxnh" Mar 14 09:09:55 crc kubenswrapper[4956]: I0314 09:09:55.364688 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-catalog-content\") pod \"redhat-operators-9cxnh\" (UID: \"4c6b256c-b1b1-4c21-abed-51f7a51d39f3\") " pod="openshift-marketplace/redhat-operators-9cxnh" Mar 14 09:09:55 crc kubenswrapper[4956]: I0314 09:09:55.364807 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-utilities\") pod \"redhat-operators-9cxnh\" (UID: \"4c6b256c-b1b1-4c21-abed-51f7a51d39f3\") " pod="openshift-marketplace/redhat-operators-9cxnh" Mar 14 09:09:55 crc kubenswrapper[4956]: I0314 09:09:55.424289 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:09:55 crc kubenswrapper[4956]: I0314 09:09:55.424390 4956 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:09:55 crc kubenswrapper[4956]: I0314 09:09:55.465437 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-utilities\") pod \"redhat-operators-9cxnh\" (UID: \"4c6b256c-b1b1-4c21-abed-51f7a51d39f3\") " pod="openshift-marketplace/redhat-operators-9cxnh" Mar 14 09:09:55 crc kubenswrapper[4956]: I0314 09:09:55.465510 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjgp9\" (UniqueName: \"kubernetes.io/projected/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-kube-api-access-rjgp9\") pod \"redhat-operators-9cxnh\" (UID: \"4c6b256c-b1b1-4c21-abed-51f7a51d39f3\") " pod="openshift-marketplace/redhat-operators-9cxnh" Mar 14 09:09:55 crc kubenswrapper[4956]: I0314 09:09:55.465553 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-catalog-content\") pod \"redhat-operators-9cxnh\" (UID: \"4c6b256c-b1b1-4c21-abed-51f7a51d39f3\") " pod="openshift-marketplace/redhat-operators-9cxnh" Mar 14 09:09:55 crc kubenswrapper[4956]: I0314 09:09:55.465951 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-utilities\") pod \"redhat-operators-9cxnh\" (UID: \"4c6b256c-b1b1-4c21-abed-51f7a51d39f3\") " pod="openshift-marketplace/redhat-operators-9cxnh" Mar 14 09:09:55 crc kubenswrapper[4956]: I0314 09:09:55.466035 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-catalog-content\") pod \"redhat-operators-9cxnh\" (UID: \"4c6b256c-b1b1-4c21-abed-51f7a51d39f3\") " pod="openshift-marketplace/redhat-operators-9cxnh" Mar 14 09:09:55 crc kubenswrapper[4956]: I0314 09:09:55.484274 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjgp9\" (UniqueName: \"kubernetes.io/projected/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-kube-api-access-rjgp9\") pod \"redhat-operators-9cxnh\" (UID: \"4c6b256c-b1b1-4c21-abed-51f7a51d39f3\") " pod="openshift-marketplace/redhat-operators-9cxnh" Mar 14 09:09:55 crc kubenswrapper[4956]: I0314 09:09:55.499045 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9cxnh" Mar 14 09:09:55 crc kubenswrapper[4956]: I0314 09:09:55.797559 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9cxnh"] Mar 14 09:09:55 crc kubenswrapper[4956]: W0314 09:09:55.803650 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c6b256c_b1b1_4c21_abed_51f7a51d39f3.slice/crio-f3b69efbe0f5b458f3ffca039d164c699d4fce1d483e2a54419193f3883af68a WatchSource:0}: Error finding container f3b69efbe0f5b458f3ffca039d164c699d4fce1d483e2a54419193f3883af68a: Status 404 returned error can't find the container with id f3b69efbe0f5b458f3ffca039d164c699d4fce1d483e2a54419193f3883af68a Mar 14 09:09:56 crc kubenswrapper[4956]: I0314 09:09:56.291681 4956 generic.go:334] "Generic (PLEG): container finished" podID="75426cf5-0280-405c-a216-58ba481acb46" containerID="0a2b977c8597585e35ff6593f4d289352a518cddc3477681d24530ab8b6b2526" exitCode=0 Mar 14 09:09:56 crc kubenswrapper[4956]: I0314 09:09:56.291770 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" 
event={"ID":"75426cf5-0280-405c-a216-58ba481acb46","Type":"ContainerDied","Data":"0a2b977c8597585e35ff6593f4d289352a518cddc3477681d24530ab8b6b2526"} Mar 14 09:09:56 crc kubenswrapper[4956]: I0314 09:09:56.293339 4956 generic.go:334] "Generic (PLEG): container finished" podID="4c6b256c-b1b1-4c21-abed-51f7a51d39f3" containerID="a0ece9f0490f54339f8e550ea55efb85b586ca36b72130e69286710b74f06928" exitCode=0 Mar 14 09:09:56 crc kubenswrapper[4956]: I0314 09:09:56.293374 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cxnh" event={"ID":"4c6b256c-b1b1-4c21-abed-51f7a51d39f3","Type":"ContainerDied","Data":"a0ece9f0490f54339f8e550ea55efb85b586ca36b72130e69286710b74f06928"} Mar 14 09:09:56 crc kubenswrapper[4956]: I0314 09:09:56.293397 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cxnh" event={"ID":"4c6b256c-b1b1-4c21-abed-51f7a51d39f3","Type":"ContainerStarted","Data":"f3b69efbe0f5b458f3ffca039d164c699d4fce1d483e2a54419193f3883af68a"} Mar 14 09:09:57 crc kubenswrapper[4956]: I0314 09:09:57.301203 4956 generic.go:334] "Generic (PLEG): container finished" podID="75426cf5-0280-405c-a216-58ba481acb46" containerID="46af917b4425e746fdb372f65fe00891564eb6fc35991d2faa229f973474ffda" exitCode=0 Mar 14 09:09:57 crc kubenswrapper[4956]: I0314 09:09:57.301267 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" event={"ID":"75426cf5-0280-405c-a216-58ba481acb46","Type":"ContainerDied","Data":"46af917b4425e746fdb372f65fe00891564eb6fc35991d2faa229f973474ffda"} Mar 14 09:09:57 crc kubenswrapper[4956]: I0314 09:09:57.302986 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cxnh" event={"ID":"4c6b256c-b1b1-4c21-abed-51f7a51d39f3","Type":"ContainerStarted","Data":"7a5dec72ca4a1d3fecf20e0f3c97c1b89120fdf6fab16e039f615301f7c1de19"} Mar 14 09:09:57 crc 
kubenswrapper[4956]: I0314 09:09:57.340711 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-2vqmj" Mar 14 09:09:58 crc kubenswrapper[4956]: I0314 09:09:58.763917 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" Mar 14 09:09:58 crc kubenswrapper[4956]: I0314 09:09:58.914232 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75426cf5-0280-405c-a216-58ba481acb46-util\") pod \"75426cf5-0280-405c-a216-58ba481acb46\" (UID: \"75426cf5-0280-405c-a216-58ba481acb46\") " Mar 14 09:09:58 crc kubenswrapper[4956]: I0314 09:09:58.914370 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx9h6\" (UniqueName: \"kubernetes.io/projected/75426cf5-0280-405c-a216-58ba481acb46-kube-api-access-fx9h6\") pod \"75426cf5-0280-405c-a216-58ba481acb46\" (UID: \"75426cf5-0280-405c-a216-58ba481acb46\") " Mar 14 09:09:58 crc kubenswrapper[4956]: I0314 09:09:58.914408 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75426cf5-0280-405c-a216-58ba481acb46-bundle\") pod \"75426cf5-0280-405c-a216-58ba481acb46\" (UID: \"75426cf5-0280-405c-a216-58ba481acb46\") " Mar 14 09:09:58 crc kubenswrapper[4956]: I0314 09:09:58.915230 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75426cf5-0280-405c-a216-58ba481acb46-bundle" (OuterVolumeSpecName: "bundle") pod "75426cf5-0280-405c-a216-58ba481acb46" (UID: "75426cf5-0280-405c-a216-58ba481acb46"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:09:58 crc kubenswrapper[4956]: I0314 09:09:58.919955 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75426cf5-0280-405c-a216-58ba481acb46-kube-api-access-fx9h6" (OuterVolumeSpecName: "kube-api-access-fx9h6") pod "75426cf5-0280-405c-a216-58ba481acb46" (UID: "75426cf5-0280-405c-a216-58ba481acb46"). InnerVolumeSpecName "kube-api-access-fx9h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:09:59 crc kubenswrapper[4956]: I0314 09:09:59.016456 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx9h6\" (UniqueName: \"kubernetes.io/projected/75426cf5-0280-405c-a216-58ba481acb46-kube-api-access-fx9h6\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:59 crc kubenswrapper[4956]: I0314 09:09:59.016512 4956 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75426cf5-0280-405c-a216-58ba481acb46-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:59 crc kubenswrapper[4956]: I0314 09:09:59.315099 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" event={"ID":"75426cf5-0280-405c-a216-58ba481acb46","Type":"ContainerDied","Data":"8a7951efd69bb3310e087d902aaaac943bad7919e0027f99a38344d6dd5b5562"} Mar 14 09:09:59 crc kubenswrapper[4956]: I0314 09:09:59.315356 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a7951efd69bb3310e087d902aaaac943bad7919e0027f99a38344d6dd5b5562" Mar 14 09:09:59 crc kubenswrapper[4956]: I0314 09:09:59.315139 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r" Mar 14 09:09:59 crc kubenswrapper[4956]: I0314 09:09:59.316361 4956 generic.go:334] "Generic (PLEG): container finished" podID="4c6b256c-b1b1-4c21-abed-51f7a51d39f3" containerID="7a5dec72ca4a1d3fecf20e0f3c97c1b89120fdf6fab16e039f615301f7c1de19" exitCode=0 Mar 14 09:09:59 crc kubenswrapper[4956]: I0314 09:09:59.316399 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cxnh" event={"ID":"4c6b256c-b1b1-4c21-abed-51f7a51d39f3","Type":"ContainerDied","Data":"7a5dec72ca4a1d3fecf20e0f3c97c1b89120fdf6fab16e039f615301f7c1de19"} Mar 14 09:09:59 crc kubenswrapper[4956]: I0314 09:09:59.708022 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75426cf5-0280-405c-a216-58ba481acb46-util" (OuterVolumeSpecName: "util") pod "75426cf5-0280-405c-a216-58ba481acb46" (UID: "75426cf5-0280-405c-a216-58ba481acb46"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:09:59 crc kubenswrapper[4956]: I0314 09:09:59.722411 4956 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75426cf5-0280-405c-a216-58ba481acb46-util\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:00 crc kubenswrapper[4956]: I0314 09:10:00.137116 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557990-s9vd6"] Mar 14 09:10:00 crc kubenswrapper[4956]: E0314 09:10:00.137403 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75426cf5-0280-405c-a216-58ba481acb46" containerName="pull" Mar 14 09:10:00 crc kubenswrapper[4956]: I0314 09:10:00.137415 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="75426cf5-0280-405c-a216-58ba481acb46" containerName="pull" Mar 14 09:10:00 crc kubenswrapper[4956]: E0314 09:10:00.137429 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75426cf5-0280-405c-a216-58ba481acb46" containerName="util" Mar 14 09:10:00 crc kubenswrapper[4956]: I0314 09:10:00.137435 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="75426cf5-0280-405c-a216-58ba481acb46" containerName="util" Mar 14 09:10:00 crc kubenswrapper[4956]: E0314 09:10:00.137446 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75426cf5-0280-405c-a216-58ba481acb46" containerName="extract" Mar 14 09:10:00 crc kubenswrapper[4956]: I0314 09:10:00.137452 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="75426cf5-0280-405c-a216-58ba481acb46" containerName="extract" Mar 14 09:10:00 crc kubenswrapper[4956]: I0314 09:10:00.137590 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="75426cf5-0280-405c-a216-58ba481acb46" containerName="extract" Mar 14 09:10:00 crc kubenswrapper[4956]: I0314 09:10:00.138039 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557990-s9vd6" Mar 14 09:10:00 crc kubenswrapper[4956]: I0314 09:10:00.140224 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:10:00 crc kubenswrapper[4956]: I0314 09:10:00.140344 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:10:00 crc kubenswrapper[4956]: I0314 09:10:00.140431 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:10:00 crc kubenswrapper[4956]: I0314 09:10:00.187067 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557990-s9vd6"] Mar 14 09:10:00 crc kubenswrapper[4956]: I0314 09:10:00.227472 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g6pd\" (UniqueName: \"kubernetes.io/projected/3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2-kube-api-access-6g6pd\") pod \"auto-csr-approver-29557990-s9vd6\" (UID: \"3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2\") " pod="openshift-infra/auto-csr-approver-29557990-s9vd6" Mar 14 09:10:00 crc kubenswrapper[4956]: I0314 09:10:00.330204 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g6pd\" (UniqueName: \"kubernetes.io/projected/3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2-kube-api-access-6g6pd\") pod \"auto-csr-approver-29557990-s9vd6\" (UID: \"3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2\") " pod="openshift-infra/auto-csr-approver-29557990-s9vd6" Mar 14 09:10:00 crc kubenswrapper[4956]: I0314 09:10:00.348929 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g6pd\" (UniqueName: \"kubernetes.io/projected/3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2-kube-api-access-6g6pd\") pod \"auto-csr-approver-29557990-s9vd6\" (UID: \"3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2\") " 
pod="openshift-infra/auto-csr-approver-29557990-s9vd6" Mar 14 09:10:00 crc kubenswrapper[4956]: I0314 09:10:00.463007 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557990-s9vd6" Mar 14 09:10:00 crc kubenswrapper[4956]: I0314 09:10:00.910845 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557990-s9vd6"] Mar 14 09:10:01 crc kubenswrapper[4956]: I0314 09:10:01.327886 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cxnh" event={"ID":"4c6b256c-b1b1-4c21-abed-51f7a51d39f3","Type":"ContainerStarted","Data":"910e717642884d88dc7fb7d3d30f4fdfa316851eaf5c13a89f8add6f3de8cbef"} Mar 14 09:10:01 crc kubenswrapper[4956]: I0314 09:10:01.329044 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557990-s9vd6" event={"ID":"3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2","Type":"ContainerStarted","Data":"fc89b2ea9887010a37fbf5f367631f97f94bf130deb3d94db5c88a2561e7680c"} Mar 14 09:10:01 crc kubenswrapper[4956]: I0314 09:10:01.371903 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9cxnh" podStartSLOduration=2.519068843 podStartE2EDuration="6.371883765s" podCreationTimestamp="2026-03-14 09:09:55 +0000 UTC" firstStartedPulling="2026-03-14 09:09:56.294490542 +0000 UTC m=+801.807182800" lastFinishedPulling="2026-03-14 09:10:00.147305454 +0000 UTC m=+805.659997722" observedRunningTime="2026-03-14 09:10:01.368716616 +0000 UTC m=+806.881408894" watchObservedRunningTime="2026-03-14 09:10:01.371883765 +0000 UTC m=+806.884576033" Mar 14 09:10:02 crc kubenswrapper[4956]: I0314 09:10:02.148821 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-scdwq"] Mar 14 09:10:02 crc kubenswrapper[4956]: I0314 09:10:02.149686 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-scdwq" Mar 14 09:10:02 crc kubenswrapper[4956]: I0314 09:10:02.153217 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 14 09:10:02 crc kubenswrapper[4956]: I0314 09:10:02.155774 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-9cvxz" Mar 14 09:10:02 crc kubenswrapper[4956]: I0314 09:10:02.156629 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 14 09:10:02 crc kubenswrapper[4956]: I0314 09:10:02.163031 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jllmj\" (UniqueName: \"kubernetes.io/projected/01314a74-350e-49d3-a090-a924ac031589-kube-api-access-jllmj\") pod \"nmstate-operator-796d4cfff4-scdwq\" (UID: \"01314a74-350e-49d3-a090-a924ac031589\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-scdwq" Mar 14 09:10:02 crc kubenswrapper[4956]: I0314 09:10:02.165640 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-scdwq"] Mar 14 09:10:02 crc kubenswrapper[4956]: I0314 09:10:02.264219 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jllmj\" (UniqueName: \"kubernetes.io/projected/01314a74-350e-49d3-a090-a924ac031589-kube-api-access-jllmj\") pod \"nmstate-operator-796d4cfff4-scdwq\" (UID: \"01314a74-350e-49d3-a090-a924ac031589\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-scdwq" Mar 14 09:10:02 crc kubenswrapper[4956]: I0314 09:10:02.287872 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jllmj\" (UniqueName: \"kubernetes.io/projected/01314a74-350e-49d3-a090-a924ac031589-kube-api-access-jllmj\") pod \"nmstate-operator-796d4cfff4-scdwq\" (UID: 
\"01314a74-350e-49d3-a090-a924ac031589\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-scdwq" Mar 14 09:10:02 crc kubenswrapper[4956]: I0314 09:10:02.334553 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557990-s9vd6" event={"ID":"3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2","Type":"ContainerStarted","Data":"3f273f4d984fda46ac4df9bbc1556168d89cc3926fa6db11b764ec4fed622546"} Mar 14 09:10:02 crc kubenswrapper[4956]: I0314 09:10:02.352783 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557990-s9vd6" podStartSLOduration=1.2885108760000001 podStartE2EDuration="2.352761027s" podCreationTimestamp="2026-03-14 09:10:00 +0000 UTC" firstStartedPulling="2026-03-14 09:10:00.919096769 +0000 UTC m=+806.431789037" lastFinishedPulling="2026-03-14 09:10:01.98334692 +0000 UTC m=+807.496039188" observedRunningTime="2026-03-14 09:10:02.347499205 +0000 UTC m=+807.860191473" watchObservedRunningTime="2026-03-14 09:10:02.352761027 +0000 UTC m=+807.865453295" Mar 14 09:10:02 crc kubenswrapper[4956]: I0314 09:10:02.466935 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-scdwq" Mar 14 09:10:02 crc kubenswrapper[4956]: I0314 09:10:02.761208 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-scdwq"] Mar 14 09:10:02 crc kubenswrapper[4956]: W0314 09:10:02.770558 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01314a74_350e_49d3_a090_a924ac031589.slice/crio-a36f2aa90be4f12a72365befdf5c4e4482e515bb30f27293f83f69124d08b8c9 WatchSource:0}: Error finding container a36f2aa90be4f12a72365befdf5c4e4482e515bb30f27293f83f69124d08b8c9: Status 404 returned error can't find the container with id a36f2aa90be4f12a72365befdf5c4e4482e515bb30f27293f83f69124d08b8c9 Mar 14 09:10:03 crc kubenswrapper[4956]: I0314 09:10:03.342607 4956 generic.go:334] "Generic (PLEG): container finished" podID="3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2" containerID="3f273f4d984fda46ac4df9bbc1556168d89cc3926fa6db11b764ec4fed622546" exitCode=0 Mar 14 09:10:03 crc kubenswrapper[4956]: I0314 09:10:03.342739 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557990-s9vd6" event={"ID":"3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2","Type":"ContainerDied","Data":"3f273f4d984fda46ac4df9bbc1556168d89cc3926fa6db11b764ec4fed622546"} Mar 14 09:10:03 crc kubenswrapper[4956]: I0314 09:10:03.343740 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-scdwq" event={"ID":"01314a74-350e-49d3-a090-a924ac031589","Type":"ContainerStarted","Data":"a36f2aa90be4f12a72365befdf5c4e4482e515bb30f27293f83f69124d08b8c9"} Mar 14 09:10:04 crc kubenswrapper[4956]: I0314 09:10:04.619604 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557990-s9vd6" Mar 14 09:10:04 crc kubenswrapper[4956]: I0314 09:10:04.801859 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6pd\" (UniqueName: \"kubernetes.io/projected/3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2-kube-api-access-6g6pd\") pod \"3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2\" (UID: \"3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2\") " Mar 14 09:10:04 crc kubenswrapper[4956]: I0314 09:10:04.813426 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2-kube-api-access-6g6pd" (OuterVolumeSpecName: "kube-api-access-6g6pd") pod "3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2" (UID: "3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2"). InnerVolumeSpecName "kube-api-access-6g6pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:10:04 crc kubenswrapper[4956]: I0314 09:10:04.903264 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6pd\" (UniqueName: \"kubernetes.io/projected/3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2-kube-api-access-6g6pd\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:05 crc kubenswrapper[4956]: I0314 09:10:05.356587 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557990-s9vd6" event={"ID":"3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2","Type":"ContainerDied","Data":"fc89b2ea9887010a37fbf5f367631f97f94bf130deb3d94db5c88a2561e7680c"} Mar 14 09:10:05 crc kubenswrapper[4956]: I0314 09:10:05.356630 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc89b2ea9887010a37fbf5f367631f97f94bf130deb3d94db5c88a2561e7680c" Mar 14 09:10:05 crc kubenswrapper[4956]: I0314 09:10:05.356695 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557990-s9vd6" Mar 14 09:10:05 crc kubenswrapper[4956]: I0314 09:10:05.400155 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557984-z8grt"] Mar 14 09:10:05 crc kubenswrapper[4956]: I0314 09:10:05.403975 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557984-z8grt"] Mar 14 09:10:05 crc kubenswrapper[4956]: I0314 09:10:05.499635 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9cxnh" Mar 14 09:10:05 crc kubenswrapper[4956]: I0314 09:10:05.499687 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9cxnh" Mar 14 09:10:06 crc kubenswrapper[4956]: I0314 09:10:06.363086 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-scdwq" event={"ID":"01314a74-350e-49d3-a090-a924ac031589","Type":"ContainerStarted","Data":"85346373401a9cc3ffaf6ebfd3f9272a0e98fbc5748b681bcba227f231f4fa1d"} Mar 14 09:10:06 crc kubenswrapper[4956]: I0314 09:10:06.379507 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-scdwq" podStartSLOduration=1.638609387 podStartE2EDuration="4.379476097s" podCreationTimestamp="2026-03-14 09:10:02 +0000 UTC" firstStartedPulling="2026-03-14 09:10:02.773150495 +0000 UTC m=+808.285842763" lastFinishedPulling="2026-03-14 09:10:05.514017205 +0000 UTC m=+811.026709473" observedRunningTime="2026-03-14 09:10:06.378086102 +0000 UTC m=+811.890778380" watchObservedRunningTime="2026-03-14 09:10:06.379476097 +0000 UTC m=+811.892168365" Mar 14 09:10:06 crc kubenswrapper[4956]: I0314 09:10:06.540228 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9cxnh" podUID="4c6b256c-b1b1-4c21-abed-51f7a51d39f3" 
containerName="registry-server" probeResult="failure" output=< Mar 14 09:10:06 crc kubenswrapper[4956]: timeout: failed to connect service ":50051" within 1s Mar 14 09:10:06 crc kubenswrapper[4956]: > Mar 14 09:10:07 crc kubenswrapper[4956]: I0314 09:10:07.217599 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a83e5d5f-ebdb-40bd-bede-617a7edba9e0" path="/var/lib/kubelet/pods/a83e5d5f-ebdb-40bd-bede-617a7edba9e0/volumes" Mar 14 09:10:15 crc kubenswrapper[4956]: I0314 09:10:15.561466 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9cxnh" Mar 14 09:10:15 crc kubenswrapper[4956]: I0314 09:10:15.601840 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9cxnh" Mar 14 09:10:15 crc kubenswrapper[4956]: I0314 09:10:15.794335 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9cxnh"] Mar 14 09:10:17 crc kubenswrapper[4956]: I0314 09:10:17.428188 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9cxnh" podUID="4c6b256c-b1b1-4c21-abed-51f7a51d39f3" containerName="registry-server" containerID="cri-o://910e717642884d88dc7fb7d3d30f4fdfa316851eaf5c13a89f8add6f3de8cbef" gracePeriod=2 Mar 14 09:10:17 crc kubenswrapper[4956]: I0314 09:10:17.757513 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9cxnh" Mar 14 09:10:17 crc kubenswrapper[4956]: I0314 09:10:17.953841 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-catalog-content\") pod \"4c6b256c-b1b1-4c21-abed-51f7a51d39f3\" (UID: \"4c6b256c-b1b1-4c21-abed-51f7a51d39f3\") " Mar 14 09:10:17 crc kubenswrapper[4956]: I0314 09:10:17.954222 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-utilities\") pod \"4c6b256c-b1b1-4c21-abed-51f7a51d39f3\" (UID: \"4c6b256c-b1b1-4c21-abed-51f7a51d39f3\") " Mar 14 09:10:17 crc kubenswrapper[4956]: I0314 09:10:17.954381 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjgp9\" (UniqueName: \"kubernetes.io/projected/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-kube-api-access-rjgp9\") pod \"4c6b256c-b1b1-4c21-abed-51f7a51d39f3\" (UID: \"4c6b256c-b1b1-4c21-abed-51f7a51d39f3\") " Mar 14 09:10:17 crc kubenswrapper[4956]: I0314 09:10:17.955215 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-utilities" (OuterVolumeSpecName: "utilities") pod "4c6b256c-b1b1-4c21-abed-51f7a51d39f3" (UID: "4c6b256c-b1b1-4c21-abed-51f7a51d39f3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:10:17 crc kubenswrapper[4956]: I0314 09:10:17.965014 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-kube-api-access-rjgp9" (OuterVolumeSpecName: "kube-api-access-rjgp9") pod "4c6b256c-b1b1-4c21-abed-51f7a51d39f3" (UID: "4c6b256c-b1b1-4c21-abed-51f7a51d39f3"). InnerVolumeSpecName "kube-api-access-rjgp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.055554 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.055583 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjgp9\" (UniqueName: \"kubernetes.io/projected/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-kube-api-access-rjgp9\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.071452 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c6b256c-b1b1-4c21-abed-51f7a51d39f3" (UID: "4c6b256c-b1b1-4c21-abed-51f7a51d39f3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.156764 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c6b256c-b1b1-4c21-abed-51f7a51d39f3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.168536 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wggcs"] Mar 14 09:10:18 crc kubenswrapper[4956]: E0314 09:10:18.168732 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6b256c-b1b1-4c21-abed-51f7a51d39f3" containerName="extract-content" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.168742 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6b256c-b1b1-4c21-abed-51f7a51d39f3" containerName="extract-content" Mar 14 09:10:18 crc kubenswrapper[4956]: E0314 09:10:18.168756 4956 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2" containerName="oc" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.168762 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2" containerName="oc" Mar 14 09:10:18 crc kubenswrapper[4956]: E0314 09:10:18.168774 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6b256c-b1b1-4c21-abed-51f7a51d39f3" containerName="registry-server" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.168781 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6b256c-b1b1-4c21-abed-51f7a51d39f3" containerName="registry-server" Mar 14 09:10:18 crc kubenswrapper[4956]: E0314 09:10:18.168797 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6b256c-b1b1-4c21-abed-51f7a51d39f3" containerName="extract-utilities" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.168803 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6b256c-b1b1-4c21-abed-51f7a51d39f3" containerName="extract-utilities" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.168889 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2" containerName="oc" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.168901 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6b256c-b1b1-4c21-abed-51f7a51d39f3" containerName="registry-server" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.169421 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wggcs" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.170852 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-5kz8q" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.175504 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-wdzfb"] Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.176344 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-wdzfb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.178074 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.182008 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wggcs"] Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.223665 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-wdzfb"] Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.259357 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-jz8xl"] Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.261696 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-jz8xl" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.326346 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h"] Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.326984 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.331593 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-vfnr2" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.331643 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.331834 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.353281 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h"] Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.361172 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdt5r\" (UniqueName: \"kubernetes.io/projected/e5ccd04c-58e5-49c2-9667-8cdaa37dd381-kube-api-access-fdt5r\") pod \"nmstate-webhook-5f558f5558-wdzfb\" (UID: \"e5ccd04c-58e5-49c2-9667-8cdaa37dd381\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-wdzfb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.361238 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wgls\" (UniqueName: \"kubernetes.io/projected/6c93f252-354d-484f-bc01-705766542dd9-kube-api-access-4wgls\") pod \"nmstate-metrics-9b8c8685d-wggcs\" (UID: \"6c93f252-354d-484f-bc01-705766542dd9\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wggcs" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.361305 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e5ccd04c-58e5-49c2-9667-8cdaa37dd381-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-wdzfb\" (UID: 
\"e5ccd04c-58e5-49c2-9667-8cdaa37dd381\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-wdzfb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.440240 4956 generic.go:334] "Generic (PLEG): container finished" podID="4c6b256c-b1b1-4c21-abed-51f7a51d39f3" containerID="910e717642884d88dc7fb7d3d30f4fdfa316851eaf5c13a89f8add6f3de8cbef" exitCode=0 Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.440284 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cxnh" event={"ID":"4c6b256c-b1b1-4c21-abed-51f7a51d39f3","Type":"ContainerDied","Data":"910e717642884d88dc7fb7d3d30f4fdfa316851eaf5c13a89f8add6f3de8cbef"} Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.440309 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cxnh" event={"ID":"4c6b256c-b1b1-4c21-abed-51f7a51d39f3","Type":"ContainerDied","Data":"f3b69efbe0f5b458f3ffca039d164c699d4fce1d483e2a54419193f3883af68a"} Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.440326 4956 scope.go:117] "RemoveContainer" containerID="910e717642884d88dc7fb7d3d30f4fdfa316851eaf5c13a89f8add6f3de8cbef" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.440453 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9cxnh" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.462043 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wgls\" (UniqueName: \"kubernetes.io/projected/6c93f252-354d-484f-bc01-705766542dd9-kube-api-access-4wgls\") pod \"nmstate-metrics-9b8c8685d-wggcs\" (UID: \"6c93f252-354d-484f-bc01-705766542dd9\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wggcs" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.462115 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e5ccd04c-58e5-49c2-9667-8cdaa37dd381-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-wdzfb\" (UID: \"e5ccd04c-58e5-49c2-9667-8cdaa37dd381\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-wdzfb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.462152 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/986e9786-a2ab-4b67-8b7c-923951ef0928-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-6w69h\" (UID: \"986e9786-a2ab-4b67-8b7c-923951ef0928\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.462190 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b0216839-cf58-4b58-8156-531abde2cbd1-ovs-socket\") pod \"nmstate-handler-jz8xl\" (UID: \"b0216839-cf58-4b58-8156-531abde2cbd1\") " pod="openshift-nmstate/nmstate-handler-jz8xl" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.462222 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdt5r\" (UniqueName: \"kubernetes.io/projected/e5ccd04c-58e5-49c2-9667-8cdaa37dd381-kube-api-access-fdt5r\") pod 
\"nmstate-webhook-5f558f5558-wdzfb\" (UID: \"e5ccd04c-58e5-49c2-9667-8cdaa37dd381\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-wdzfb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.462245 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5cgd\" (UniqueName: \"kubernetes.io/projected/986e9786-a2ab-4b67-8b7c-923951ef0928-kube-api-access-p5cgd\") pod \"nmstate-console-plugin-86f58fcf4-6w69h\" (UID: \"986e9786-a2ab-4b67-8b7c-923951ef0928\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.462271 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b0216839-cf58-4b58-8156-531abde2cbd1-nmstate-lock\") pod \"nmstate-handler-jz8xl\" (UID: \"b0216839-cf58-4b58-8156-531abde2cbd1\") " pod="openshift-nmstate/nmstate-handler-jz8xl" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.462312 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b0216839-cf58-4b58-8156-531abde2cbd1-dbus-socket\") pod \"nmstate-handler-jz8xl\" (UID: \"b0216839-cf58-4b58-8156-531abde2cbd1\") " pod="openshift-nmstate/nmstate-handler-jz8xl" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.462344 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/986e9786-a2ab-4b67-8b7c-923951ef0928-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-6w69h\" (UID: \"986e9786-a2ab-4b67-8b7c-923951ef0928\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.462371 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gfbmq\" (UniqueName: \"kubernetes.io/projected/b0216839-cf58-4b58-8156-531abde2cbd1-kube-api-access-gfbmq\") pod \"nmstate-handler-jz8xl\" (UID: \"b0216839-cf58-4b58-8156-531abde2cbd1\") " pod="openshift-nmstate/nmstate-handler-jz8xl" Mar 14 09:10:18 crc kubenswrapper[4956]: E0314 09:10:18.462521 4956 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 14 09:10:18 crc kubenswrapper[4956]: E0314 09:10:18.462571 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5ccd04c-58e5-49c2-9667-8cdaa37dd381-tls-key-pair podName:e5ccd04c-58e5-49c2-9667-8cdaa37dd381 nodeName:}" failed. No retries permitted until 2026-03-14 09:10:18.962552154 +0000 UTC m=+824.475244422 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/e5ccd04c-58e5-49c2-9667-8cdaa37dd381-tls-key-pair") pod "nmstate-webhook-5f558f5558-wdzfb" (UID: "e5ccd04c-58e5-49c2-9667-8cdaa37dd381") : secret "openshift-nmstate-webhook" not found Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.475357 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9cxnh"] Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.475550 4956 scope.go:117] "RemoveContainer" containerID="7a5dec72ca4a1d3fecf20e0f3c97c1b89120fdf6fab16e039f615301f7c1de19" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.480331 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9cxnh"] Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.482783 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wgls\" (UniqueName: \"kubernetes.io/projected/6c93f252-354d-484f-bc01-705766542dd9-kube-api-access-4wgls\") pod \"nmstate-metrics-9b8c8685d-wggcs\" (UID: \"6c93f252-354d-484f-bc01-705766542dd9\") " 
pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wggcs" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.489809 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdt5r\" (UniqueName: \"kubernetes.io/projected/e5ccd04c-58e5-49c2-9667-8cdaa37dd381-kube-api-access-fdt5r\") pod \"nmstate-webhook-5f558f5558-wdzfb\" (UID: \"e5ccd04c-58e5-49c2-9667-8cdaa37dd381\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-wdzfb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.492219 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wggcs" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.535837 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-757d47dbcd-vwrnb"] Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.536589 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.563421 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/986e9786-a2ab-4b67-8b7c-923951ef0928-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-6w69h\" (UID: \"986e9786-a2ab-4b67-8b7c-923951ef0928\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.563466 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b0216839-cf58-4b58-8156-531abde2cbd1-ovs-socket\") pod \"nmstate-handler-jz8xl\" (UID: \"b0216839-cf58-4b58-8156-531abde2cbd1\") " pod="openshift-nmstate/nmstate-handler-jz8xl" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.563521 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5cgd\" (UniqueName: 
\"kubernetes.io/projected/986e9786-a2ab-4b67-8b7c-923951ef0928-kube-api-access-p5cgd\") pod \"nmstate-console-plugin-86f58fcf4-6w69h\" (UID: \"986e9786-a2ab-4b67-8b7c-923951ef0928\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.563540 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b0216839-cf58-4b58-8156-531abde2cbd1-nmstate-lock\") pod \"nmstate-handler-jz8xl\" (UID: \"b0216839-cf58-4b58-8156-531abde2cbd1\") " pod="openshift-nmstate/nmstate-handler-jz8xl" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.563558 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b0216839-cf58-4b58-8156-531abde2cbd1-dbus-socket\") pod \"nmstate-handler-jz8xl\" (UID: \"b0216839-cf58-4b58-8156-531abde2cbd1\") " pod="openshift-nmstate/nmstate-handler-jz8xl" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.563582 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/986e9786-a2ab-4b67-8b7c-923951ef0928-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-6w69h\" (UID: \"986e9786-a2ab-4b67-8b7c-923951ef0928\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.563603 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfbmq\" (UniqueName: \"kubernetes.io/projected/b0216839-cf58-4b58-8156-531abde2cbd1-kube-api-access-gfbmq\") pod \"nmstate-handler-jz8xl\" (UID: \"b0216839-cf58-4b58-8156-531abde2cbd1\") " pod="openshift-nmstate/nmstate-handler-jz8xl" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.564593 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/986e9786-a2ab-4b67-8b7c-923951ef0928-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-6w69h\" (UID: \"986e9786-a2ab-4b67-8b7c-923951ef0928\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.564640 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b0216839-cf58-4b58-8156-531abde2cbd1-ovs-socket\") pod \"nmstate-handler-jz8xl\" (UID: \"b0216839-cf58-4b58-8156-531abde2cbd1\") " pod="openshift-nmstate/nmstate-handler-jz8xl" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.564793 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b0216839-cf58-4b58-8156-531abde2cbd1-nmstate-lock\") pod \"nmstate-handler-jz8xl\" (UID: \"b0216839-cf58-4b58-8156-531abde2cbd1\") " pod="openshift-nmstate/nmstate-handler-jz8xl" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.565016 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b0216839-cf58-4b58-8156-531abde2cbd1-dbus-socket\") pod \"nmstate-handler-jz8xl\" (UID: \"b0216839-cf58-4b58-8156-531abde2cbd1\") " pod="openshift-nmstate/nmstate-handler-jz8xl" Mar 14 09:10:18 crc kubenswrapper[4956]: E0314 09:10:18.565072 4956 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 14 09:10:18 crc kubenswrapper[4956]: E0314 09:10:18.565107 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/986e9786-a2ab-4b67-8b7c-923951ef0928-plugin-serving-cert podName:986e9786-a2ab-4b67-8b7c-923951ef0928 nodeName:}" failed. No retries permitted until 2026-03-14 09:10:19.065095582 +0000 UTC m=+824.577787850 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/986e9786-a2ab-4b67-8b7c-923951ef0928-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-6w69h" (UID: "986e9786-a2ab-4b67-8b7c-923951ef0928") : secret "plugin-serving-cert" not found Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.585498 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5cgd\" (UniqueName: \"kubernetes.io/projected/986e9786-a2ab-4b67-8b7c-923951ef0928-kube-api-access-p5cgd\") pod \"nmstate-console-plugin-86f58fcf4-6w69h\" (UID: \"986e9786-a2ab-4b67-8b7c-923951ef0928\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.591533 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfbmq\" (UniqueName: \"kubernetes.io/projected/b0216839-cf58-4b58-8156-531abde2cbd1-kube-api-access-gfbmq\") pod \"nmstate-handler-jz8xl\" (UID: \"b0216839-cf58-4b58-8156-531abde2cbd1\") " pod="openshift-nmstate/nmstate-handler-jz8xl" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.624892 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-jz8xl" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.665076 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d97bedae-1c06-4fac-8861-a44f69574365-console-serving-cert\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.665164 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d97bedae-1c06-4fac-8861-a44f69574365-console-oauth-config\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.665213 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw9dc\" (UniqueName: \"kubernetes.io/projected/d97bedae-1c06-4fac-8861-a44f69574365-kube-api-access-nw9dc\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.665230 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-service-ca\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.665249 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-trusted-ca-bundle\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.665263 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-console-config\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.665280 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-oauth-serving-cert\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.765968 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d97bedae-1c06-4fac-8861-a44f69574365-console-oauth-config\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.766064 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-service-ca\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.766089 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-trusted-ca-bundle\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.766108 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw9dc\" (UniqueName: \"kubernetes.io/projected/d97bedae-1c06-4fac-8861-a44f69574365-kube-api-access-nw9dc\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.766130 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-console-config\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.766159 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-oauth-serving-cert\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.766188 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d97bedae-1c06-4fac-8861-a44f69574365-console-serving-cert\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.776984 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/d97bedae-1c06-4fac-8861-a44f69574365-console-oauth-config\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.777608 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-service-ca\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.778267 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-trusted-ca-bundle\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.778990 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-console-config\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.779447 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-oauth-serving-cert\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.779855 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d97bedae-1c06-4fac-8861-a44f69574365-console-serving-cert\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.793524 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw9dc\" (UniqueName: \"kubernetes.io/projected/d97bedae-1c06-4fac-8861-a44f69574365-kube-api-access-nw9dc\") pod \"console-757d47dbcd-vwrnb\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.936429 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.950687 4956 scope.go:117] "RemoveContainer" containerID="a0ece9f0490f54339f8e550ea55efb85b586ca36b72130e69286710b74f06928" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.969368 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e5ccd04c-58e5-49c2-9667-8cdaa37dd381-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-wdzfb\" (UID: \"e5ccd04c-58e5-49c2-9667-8cdaa37dd381\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-wdzfb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.982201 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e5ccd04c-58e5-49c2-9667-8cdaa37dd381-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-wdzfb\" (UID: \"e5ccd04c-58e5-49c2-9667-8cdaa37dd381\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-wdzfb" Mar 14 09:10:18 crc kubenswrapper[4956]: I0314 09:10:18.985287 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-757d47dbcd-vwrnb"] Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.028301 
4956 scope.go:117] "RemoveContainer" containerID="910e717642884d88dc7fb7d3d30f4fdfa316851eaf5c13a89f8add6f3de8cbef" Mar 14 09:10:19 crc kubenswrapper[4956]: E0314 09:10:19.028794 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"910e717642884d88dc7fb7d3d30f4fdfa316851eaf5c13a89f8add6f3de8cbef\": container with ID starting with 910e717642884d88dc7fb7d3d30f4fdfa316851eaf5c13a89f8add6f3de8cbef not found: ID does not exist" containerID="910e717642884d88dc7fb7d3d30f4fdfa316851eaf5c13a89f8add6f3de8cbef" Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.028845 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"910e717642884d88dc7fb7d3d30f4fdfa316851eaf5c13a89f8add6f3de8cbef"} err="failed to get container status \"910e717642884d88dc7fb7d3d30f4fdfa316851eaf5c13a89f8add6f3de8cbef\": rpc error: code = NotFound desc = could not find container \"910e717642884d88dc7fb7d3d30f4fdfa316851eaf5c13a89f8add6f3de8cbef\": container with ID starting with 910e717642884d88dc7fb7d3d30f4fdfa316851eaf5c13a89f8add6f3de8cbef not found: ID does not exist" Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.028881 4956 scope.go:117] "RemoveContainer" containerID="7a5dec72ca4a1d3fecf20e0f3c97c1b89120fdf6fab16e039f615301f7c1de19" Mar 14 09:10:19 crc kubenswrapper[4956]: E0314 09:10:19.029378 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a5dec72ca4a1d3fecf20e0f3c97c1b89120fdf6fab16e039f615301f7c1de19\": container with ID starting with 7a5dec72ca4a1d3fecf20e0f3c97c1b89120fdf6fab16e039f615301f7c1de19 not found: ID does not exist" containerID="7a5dec72ca4a1d3fecf20e0f3c97c1b89120fdf6fab16e039f615301f7c1de19" Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.029403 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7a5dec72ca4a1d3fecf20e0f3c97c1b89120fdf6fab16e039f615301f7c1de19"} err="failed to get container status \"7a5dec72ca4a1d3fecf20e0f3c97c1b89120fdf6fab16e039f615301f7c1de19\": rpc error: code = NotFound desc = could not find container \"7a5dec72ca4a1d3fecf20e0f3c97c1b89120fdf6fab16e039f615301f7c1de19\": container with ID starting with 7a5dec72ca4a1d3fecf20e0f3c97c1b89120fdf6fab16e039f615301f7c1de19 not found: ID does not exist" Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.029421 4956 scope.go:117] "RemoveContainer" containerID="a0ece9f0490f54339f8e550ea55efb85b586ca36b72130e69286710b74f06928" Mar 14 09:10:19 crc kubenswrapper[4956]: E0314 09:10:19.029763 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0ece9f0490f54339f8e550ea55efb85b586ca36b72130e69286710b74f06928\": container with ID starting with a0ece9f0490f54339f8e550ea55efb85b586ca36b72130e69286710b74f06928 not found: ID does not exist" containerID="a0ece9f0490f54339f8e550ea55efb85b586ca36b72130e69286710b74f06928" Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.029851 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0ece9f0490f54339f8e550ea55efb85b586ca36b72130e69286710b74f06928"} err="failed to get container status \"a0ece9f0490f54339f8e550ea55efb85b586ca36b72130e69286710b74f06928\": rpc error: code = NotFound desc = could not find container \"a0ece9f0490f54339f8e550ea55efb85b586ca36b72130e69286710b74f06928\": container with ID starting with a0ece9f0490f54339f8e550ea55efb85b586ca36b72130e69286710b74f06928 not found: ID does not exist" Mar 14 09:10:19 crc kubenswrapper[4956]: W0314 09:10:19.035089 4956 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0216839_cf58_4b58_8156_531abde2cbd1.slice/crio-9db21d43c5c98ebe14120c793a6d4498599a0096d9d7bf43da0bd7f602f0a3e9 WatchSource:0}: Error finding container 9db21d43c5c98ebe14120c793a6d4498599a0096d9d7bf43da0bd7f602f0a3e9: Status 404 returned error can't find the container with id 9db21d43c5c98ebe14120c793a6d4498599a0096d9d7bf43da0bd7f602f0a3e9 Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.071039 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/986e9786-a2ab-4b67-8b7c-923951ef0928-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-6w69h\" (UID: \"986e9786-a2ab-4b67-8b7c-923951ef0928\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h" Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.076266 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/986e9786-a2ab-4b67-8b7c-923951ef0928-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-6w69h\" (UID: \"986e9786-a2ab-4b67-8b7c-923951ef0928\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h" Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.084811 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wggcs"] Mar 14 09:10:19 crc kubenswrapper[4956]: W0314 09:10:19.093516 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c93f252_354d_484f_bc01_705766542dd9.slice/crio-69303d755f7365a4c5516d107960c099d2a74ba1a8876995b42deb561ba8c4a3 WatchSource:0}: Error finding container 69303d755f7365a4c5516d107960c099d2a74ba1a8876995b42deb561ba8c4a3: Status 404 returned error can't find the container with id 69303d755f7365a4c5516d107960c099d2a74ba1a8876995b42deb561ba8c4a3 Mar 14 09:10:19 crc 
kubenswrapper[4956]: I0314 09:10:19.101464 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-wdzfb" Mar 14 09:10:19 crc kubenswrapper[4956]: W0314 09:10:19.214021 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd97bedae_1c06_4fac_8861_a44f69574365.slice/crio-195f679bb058cd7bb2aad41a5684358fea7d4153c9bc296c87677c048b642b2e WatchSource:0}: Error finding container 195f679bb058cd7bb2aad41a5684358fea7d4153c9bc296c87677c048b642b2e: Status 404 returned error can't find the container with id 195f679bb058cd7bb2aad41a5684358fea7d4153c9bc296c87677c048b642b2e Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.226960 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c6b256c-b1b1-4c21-abed-51f7a51d39f3" path="/var/lib/kubelet/pods/4c6b256c-b1b1-4c21-abed-51f7a51d39f3/volumes" Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.230666 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-757d47dbcd-vwrnb"] Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.244546 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h" Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.283265 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-wdzfb"] Mar 14 09:10:19 crc kubenswrapper[4956]: W0314 09:10:19.295276 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5ccd04c_58e5_49c2_9667_8cdaa37dd381.slice/crio-6f9ebcff76d84ea061029bc4461e91a1a2aeb3d0a83d0469d0a38a68dc12bdf7 WatchSource:0}: Error finding container 6f9ebcff76d84ea061029bc4461e91a1a2aeb3d0a83d0469d0a38a68dc12bdf7: Status 404 returned error can't find the container with id 6f9ebcff76d84ea061029bc4461e91a1a2aeb3d0a83d0469d0a38a68dc12bdf7 Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.404155 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h"] Mar 14 09:10:19 crc kubenswrapper[4956]: W0314 09:10:19.407425 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod986e9786_a2ab_4b67_8b7c_923951ef0928.slice/crio-e21f7a56a3a7da56d1a6ee4af39789bce03d64f395fe321522090ef0ba180957 WatchSource:0}: Error finding container e21f7a56a3a7da56d1a6ee4af39789bce03d64f395fe321522090ef0ba180957: Status 404 returned error can't find the container with id e21f7a56a3a7da56d1a6ee4af39789bce03d64f395fe321522090ef0ba180957 Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.446918 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-wdzfb" event={"ID":"e5ccd04c-58e5-49c2-9667-8cdaa37dd381","Type":"ContainerStarted","Data":"6f9ebcff76d84ea061029bc4461e91a1a2aeb3d0a83d0469d0a38a68dc12bdf7"} Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.447744 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-handler-jz8xl" event={"ID":"b0216839-cf58-4b58-8156-531abde2cbd1","Type":"ContainerStarted","Data":"9db21d43c5c98ebe14120c793a6d4498599a0096d9d7bf43da0bd7f602f0a3e9"} Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.450111 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wggcs" event={"ID":"6c93f252-354d-484f-bc01-705766542dd9","Type":"ContainerStarted","Data":"69303d755f7365a4c5516d107960c099d2a74ba1a8876995b42deb561ba8c4a3"} Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.451918 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-757d47dbcd-vwrnb" event={"ID":"d97bedae-1c06-4fac-8861-a44f69574365","Type":"ContainerStarted","Data":"1581cbc4cbc458d627abd4427d8dc4949bae0fefff19cfec1b374cea5c73d217"} Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.451954 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-757d47dbcd-vwrnb" event={"ID":"d97bedae-1c06-4fac-8861-a44f69574365","Type":"ContainerStarted","Data":"195f679bb058cd7bb2aad41a5684358fea7d4153c9bc296c87677c048b642b2e"} Mar 14 09:10:19 crc kubenswrapper[4956]: I0314 09:10:19.452783 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h" event={"ID":"986e9786-a2ab-4b67-8b7c-923951ef0928","Type":"ContainerStarted","Data":"e21f7a56a3a7da56d1a6ee4af39789bce03d64f395fe321522090ef0ba180957"} Mar 14 09:10:20 crc kubenswrapper[4956]: I0314 09:10:20.475882 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-757d47dbcd-vwrnb" podStartSLOduration=2.475858512 podStartE2EDuration="2.475858512s" podCreationTimestamp="2026-03-14 09:10:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:10:20.470867748 +0000 UTC m=+825.983560016" 
watchObservedRunningTime="2026-03-14 09:10:20.475858512 +0000 UTC m=+825.988550780" Mar 14 09:10:25 crc kubenswrapper[4956]: I0314 09:10:25.423568 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:10:25 crc kubenswrapper[4956]: I0314 09:10:25.424175 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:10:25 crc kubenswrapper[4956]: I0314 09:10:25.424228 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 09:10:25 crc kubenswrapper[4956]: I0314 09:10:25.424918 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a47c6c00ee731cd8c4ec91df013e1936bd43fead85afc21e044951f9ea4d95f7"} pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:10:25 crc kubenswrapper[4956]: I0314 09:10:25.424976 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" containerID="cri-o://a47c6c00ee731cd8c4ec91df013e1936bd43fead85afc21e044951f9ea4d95f7" gracePeriod=600 Mar 14 09:10:25 crc kubenswrapper[4956]: I0314 09:10:25.496233 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-wdzfb" event={"ID":"e5ccd04c-58e5-49c2-9667-8cdaa37dd381","Type":"ContainerStarted","Data":"b0b7d54e12f675bdc1da55bb1a0e284eec4f7fe1d2d13c2dbd395ed30aea0929"} Mar 14 09:10:25 crc kubenswrapper[4956]: I0314 09:10:25.496356 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-wdzfb" Mar 14 09:10:25 crc kubenswrapper[4956]: I0314 09:10:25.499091 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jz8xl" event={"ID":"b0216839-cf58-4b58-8156-531abde2cbd1","Type":"ContainerStarted","Data":"ae07c071f7dbccf863c491dff8682eb3ef4d6df763b4984a94f809e94b7c7ba1"} Mar 14 09:10:25 crc kubenswrapper[4956]: I0314 09:10:25.499124 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-jz8xl" Mar 14 09:10:25 crc kubenswrapper[4956]: I0314 09:10:25.519156 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wggcs" event={"ID":"6c93f252-354d-484f-bc01-705766542dd9","Type":"ContainerStarted","Data":"94237aaf684595a01f66758c56021cd09d6fda7076092bfb7e6a6a3d19ad0b3e"} Mar 14 09:10:25 crc kubenswrapper[4956]: I0314 09:10:25.535003 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-wdzfb" podStartSLOduration=2.067981916 podStartE2EDuration="7.534980651s" podCreationTimestamp="2026-03-14 09:10:18 +0000 UTC" firstStartedPulling="2026-03-14 09:10:19.298072568 +0000 UTC m=+824.810764836" lastFinishedPulling="2026-03-14 09:10:24.765071293 +0000 UTC m=+830.277763571" observedRunningTime="2026-03-14 09:10:25.518673684 +0000 UTC m=+831.031365962" watchObservedRunningTime="2026-03-14 09:10:25.534980651 +0000 UTC m=+831.047672919" Mar 14 09:10:25 crc kubenswrapper[4956]: I0314 09:10:25.538563 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-handler-jz8xl" podStartSLOduration=1.833951058 podStartE2EDuration="7.53854063s" podCreationTimestamp="2026-03-14 09:10:18 +0000 UTC" firstStartedPulling="2026-03-14 09:10:19.037382944 +0000 UTC m=+824.550075212" lastFinishedPulling="2026-03-14 09:10:24.741972506 +0000 UTC m=+830.254664784" observedRunningTime="2026-03-14 09:10:25.536177921 +0000 UTC m=+831.048870189" watchObservedRunningTime="2026-03-14 09:10:25.53854063 +0000 UTC m=+831.051232898" Mar 14 09:10:26 crc kubenswrapper[4956]: I0314 09:10:26.528324 4956 generic.go:334] "Generic (PLEG): container finished" podID="9ba20367-e506-422e-a846-eb1525cb3b94" containerID="a47c6c00ee731cd8c4ec91df013e1936bd43fead85afc21e044951f9ea4d95f7" exitCode=0 Mar 14 09:10:26 crc kubenswrapper[4956]: I0314 09:10:26.528396 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerDied","Data":"a47c6c00ee731cd8c4ec91df013e1936bd43fead85afc21e044951f9ea4d95f7"} Mar 14 09:10:26 crc kubenswrapper[4956]: I0314 09:10:26.528673 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerStarted","Data":"c0530da8ab6e9909827338d4f4090fb9eca5500f5a446223956b17c49ce8aec4"} Mar 14 09:10:26 crc kubenswrapper[4956]: I0314 09:10:26.528701 4956 scope.go:117] "RemoveContainer" containerID="f3de809325f5e89aa2de74df02911d0fee80e431b2a2454fac310773a12e5f0a" Mar 14 09:10:27 crc kubenswrapper[4956]: I0314 09:10:27.535579 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h" event={"ID":"986e9786-a2ab-4b67-8b7c-923951ef0928","Type":"ContainerStarted","Data":"b28bcb4554f70114e12555ae3eecce2d00d7582dc2e94b8f55b8c2759c885ba9"} Mar 14 09:10:27 crc kubenswrapper[4956]: I0314 09:10:27.549648 4956 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6w69h" podStartSLOduration=1.748741772 podStartE2EDuration="9.549631033s" podCreationTimestamp="2026-03-14 09:10:18 +0000 UTC" firstStartedPulling="2026-03-14 09:10:19.409457977 +0000 UTC m=+824.922150245" lastFinishedPulling="2026-03-14 09:10:27.210347238 +0000 UTC m=+832.723039506" observedRunningTime="2026-03-14 09:10:27.547998122 +0000 UTC m=+833.060690400" watchObservedRunningTime="2026-03-14 09:10:27.549631033 +0000 UTC m=+833.062323301" Mar 14 09:10:28 crc kubenswrapper[4956]: I0314 09:10:28.938032 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:28 crc kubenswrapper[4956]: I0314 09:10:28.938402 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:28 crc kubenswrapper[4956]: I0314 09:10:28.942735 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:29 crc kubenswrapper[4956]: I0314 09:10:29.560564 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wggcs" event={"ID":"6c93f252-354d-484f-bc01-705766542dd9","Type":"ContainerStarted","Data":"9db930a747b3fd7ad3fc5228407c2d2887419628add9e0d142880a6a00f2589a"} Mar 14 09:10:29 crc kubenswrapper[4956]: I0314 09:10:29.566292 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:10:29 crc kubenswrapper[4956]: I0314 09:10:29.584988 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wggcs" podStartSLOduration=2.137145011 podStartE2EDuration="11.584967451s" podCreationTimestamp="2026-03-14 09:10:18 +0000 UTC" firstStartedPulling="2026-03-14 
09:10:19.100705404 +0000 UTC m=+824.613397672" lastFinishedPulling="2026-03-14 09:10:28.548527834 +0000 UTC m=+834.061220112" observedRunningTime="2026-03-14 09:10:29.581015772 +0000 UTC m=+835.093708050" watchObservedRunningTime="2026-03-14 09:10:29.584967451 +0000 UTC m=+835.097659719" Mar 14 09:10:29 crc kubenswrapper[4956]: I0314 09:10:29.638616 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8qng2"] Mar 14 09:10:33 crc kubenswrapper[4956]: I0314 09:10:33.647432 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-jz8xl" Mar 14 09:10:39 crc kubenswrapper[4956]: I0314 09:10:39.106463 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-wdzfb" Mar 14 09:10:46 crc kubenswrapper[4956]: I0314 09:10:46.542983 4956 scope.go:117] "RemoveContainer" containerID="5b6822127eb6caa549ad5bb7e7435c50a72e3a7ca443ba687b27be5a6325037a" Mar 14 09:10:53 crc kubenswrapper[4956]: I0314 09:10:53.580877 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6"] Mar 14 09:10:53 crc kubenswrapper[4956]: I0314 09:10:53.582847 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" Mar 14 09:10:53 crc kubenswrapper[4956]: I0314 09:10:53.586697 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 09:10:53 crc kubenswrapper[4956]: I0314 09:10:53.590472 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6"] Mar 14 09:10:53 crc kubenswrapper[4956]: I0314 09:10:53.627907 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37548b9f-0521-4b58-a42a-a023fe66022b-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6\" (UID: \"37548b9f-0521-4b58-a42a-a023fe66022b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" Mar 14 09:10:53 crc kubenswrapper[4956]: I0314 09:10:53.627965 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkfsh\" (UniqueName: \"kubernetes.io/projected/37548b9f-0521-4b58-a42a-a023fe66022b-kube-api-access-jkfsh\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6\" (UID: \"37548b9f-0521-4b58-a42a-a023fe66022b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" Mar 14 09:10:53 crc kubenswrapper[4956]: I0314 09:10:53.627985 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37548b9f-0521-4b58-a42a-a023fe66022b-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6\" (UID: \"37548b9f-0521-4b58-a42a-a023fe66022b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" Mar 14 09:10:53 crc kubenswrapper[4956]: 
I0314 09:10:53.729668 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37548b9f-0521-4b58-a42a-a023fe66022b-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6\" (UID: \"37548b9f-0521-4b58-a42a-a023fe66022b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" Mar 14 09:10:53 crc kubenswrapper[4956]: I0314 09:10:53.730273 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37548b9f-0521-4b58-a42a-a023fe66022b-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6\" (UID: \"37548b9f-0521-4b58-a42a-a023fe66022b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" Mar 14 09:10:53 crc kubenswrapper[4956]: I0314 09:10:53.730357 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkfsh\" (UniqueName: \"kubernetes.io/projected/37548b9f-0521-4b58-a42a-a023fe66022b-kube-api-access-jkfsh\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6\" (UID: \"37548b9f-0521-4b58-a42a-a023fe66022b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" Mar 14 09:10:53 crc kubenswrapper[4956]: I0314 09:10:53.730437 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37548b9f-0521-4b58-a42a-a023fe66022b-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6\" (UID: \"37548b9f-0521-4b58-a42a-a023fe66022b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" Mar 14 09:10:53 crc kubenswrapper[4956]: I0314 09:10:53.731231 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/37548b9f-0521-4b58-a42a-a023fe66022b-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6\" (UID: \"37548b9f-0521-4b58-a42a-a023fe66022b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" Mar 14 09:10:53 crc kubenswrapper[4956]: I0314 09:10:53.750423 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkfsh\" (UniqueName: \"kubernetes.io/projected/37548b9f-0521-4b58-a42a-a023fe66022b-kube-api-access-jkfsh\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6\" (UID: \"37548b9f-0521-4b58-a42a-a023fe66022b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" Mar 14 09:10:53 crc kubenswrapper[4956]: I0314 09:10:53.908032 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" Mar 14 09:10:54 crc kubenswrapper[4956]: I0314 09:10:54.331429 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6"] Mar 14 09:10:54 crc kubenswrapper[4956]: I0314 09:10:54.681422 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-8qng2" podUID="d7ce6871-ad8a-4ad9-b804-f3931e0f60ae" containerName="console" containerID="cri-o://bc100bbf32be076761f698905be28c2aa4060c07026617be61281a8c18ed816a" gracePeriod=15 Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.020656 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8qng2_d7ce6871-ad8a-4ad9-b804-f3931e0f60ae/console/0.log" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.020981 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-8qng2" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.027421 4956 generic.go:334] "Generic (PLEG): container finished" podID="37548b9f-0521-4b58-a42a-a023fe66022b" containerID="1806ee0c55cd18112ca6038d202fa87d4fa28c72f8ef7d639d5afe040fe3e4e1" exitCode=0 Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.027534 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" event={"ID":"37548b9f-0521-4b58-a42a-a023fe66022b","Type":"ContainerDied","Data":"1806ee0c55cd18112ca6038d202fa87d4fa28c72f8ef7d639d5afe040fe3e4e1"} Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.027576 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" event={"ID":"37548b9f-0521-4b58-a42a-a023fe66022b","Type":"ContainerStarted","Data":"3c6853712c0a5e9a9989dd63a2bfb1c1b3f3b02ec49d15efcd51e325c2f0d88f"} Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.028772 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8qng2_d7ce6871-ad8a-4ad9-b804-f3931e0f60ae/console/0.log" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.028815 4956 generic.go:334] "Generic (PLEG): container finished" podID="d7ce6871-ad8a-4ad9-b804-f3931e0f60ae" containerID="bc100bbf32be076761f698905be28c2aa4060c07026617be61281a8c18ed816a" exitCode=2 Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.028838 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8qng2" event={"ID":"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae","Type":"ContainerDied","Data":"bc100bbf32be076761f698905be28c2aa4060c07026617be61281a8c18ed816a"} Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.028856 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-f9d7485db-8qng2" event={"ID":"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae","Type":"ContainerDied","Data":"da001f9e097418bf6d812ac9f91b8a6dfa1c8d5680dd4c0738389f39a7393596"} Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.028875 4956 scope.go:117] "RemoveContainer" containerID="bc100bbf32be076761f698905be28c2aa4060c07026617be61281a8c18ed816a" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.028999 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8qng2" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.052921 4956 scope.go:117] "RemoveContainer" containerID="bc100bbf32be076761f698905be28c2aa4060c07026617be61281a8c18ed816a" Mar 14 09:10:55 crc kubenswrapper[4956]: E0314 09:10:55.053275 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc100bbf32be076761f698905be28c2aa4060c07026617be61281a8c18ed816a\": container with ID starting with bc100bbf32be076761f698905be28c2aa4060c07026617be61281a8c18ed816a not found: ID does not exist" containerID="bc100bbf32be076761f698905be28c2aa4060c07026617be61281a8c18ed816a" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.053299 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc100bbf32be076761f698905be28c2aa4060c07026617be61281a8c18ed816a"} err="failed to get container status \"bc100bbf32be076761f698905be28c2aa4060c07026617be61281a8c18ed816a\": rpc error: code = NotFound desc = could not find container \"bc100bbf32be076761f698905be28c2aa4060c07026617be61281a8c18ed816a\": container with ID starting with bc100bbf32be076761f698905be28c2aa4060c07026617be61281a8c18ed816a not found: ID does not exist" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.145862 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-trusted-ca-bundle\") pod \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.145916 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-serving-cert\") pod \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.145966 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v95dt\" (UniqueName: \"kubernetes.io/projected/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-kube-api-access-v95dt\") pod \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.146005 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-config\") pod \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.146031 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-service-ca\") pod \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.146083 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-oauth-config\") pod \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " Mar 14 09:10:55 crc 
kubenswrapper[4956]: I0314 09:10:55.146107 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-oauth-serving-cert\") pod \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\" (UID: \"d7ce6871-ad8a-4ad9-b804-f3931e0f60ae\") " Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.146677 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d7ce6871-ad8a-4ad9-b804-f3931e0f60ae" (UID: "d7ce6871-ad8a-4ad9-b804-f3931e0f60ae"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.146799 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d7ce6871-ad8a-4ad9-b804-f3931e0f60ae" (UID: "d7ce6871-ad8a-4ad9-b804-f3931e0f60ae"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.147211 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-config" (OuterVolumeSpecName: "console-config") pod "d7ce6871-ad8a-4ad9-b804-f3931e0f60ae" (UID: "d7ce6871-ad8a-4ad9-b804-f3931e0f60ae"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.147378 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-service-ca" (OuterVolumeSpecName: "service-ca") pod "d7ce6871-ad8a-4ad9-b804-f3931e0f60ae" (UID: "d7ce6871-ad8a-4ad9-b804-f3931e0f60ae"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.151819 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d7ce6871-ad8a-4ad9-b804-f3931e0f60ae" (UID: "d7ce6871-ad8a-4ad9-b804-f3931e0f60ae"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.153692 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d7ce6871-ad8a-4ad9-b804-f3931e0f60ae" (UID: "d7ce6871-ad8a-4ad9-b804-f3931e0f60ae"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.156121 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-kube-api-access-v95dt" (OuterVolumeSpecName: "kube-api-access-v95dt") pod "d7ce6871-ad8a-4ad9-b804-f3931e0f60ae" (UID: "d7ce6871-ad8a-4ad9-b804-f3931e0f60ae"). InnerVolumeSpecName "kube-api-access-v95dt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.247622 4956 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.247649 4956 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.247658 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.247667 4956 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.247676 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v95dt\" (UniqueName: \"kubernetes.io/projected/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-kube-api-access-v95dt\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.247688 4956 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-console-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.247698 4956 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:55 crc 
kubenswrapper[4956]: I0314 09:10:55.352367 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8qng2"] Mar 14 09:10:55 crc kubenswrapper[4956]: I0314 09:10:55.360576 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-8qng2"] Mar 14 09:10:57 crc kubenswrapper[4956]: I0314 09:10:57.047090 4956 generic.go:334] "Generic (PLEG): container finished" podID="37548b9f-0521-4b58-a42a-a023fe66022b" containerID="d347f254c5b56be2df20e3ab7c44db38af0916fe51bb3b3fdb4eb248bd6176a1" exitCode=0 Mar 14 09:10:57 crc kubenswrapper[4956]: I0314 09:10:57.047157 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" event={"ID":"37548b9f-0521-4b58-a42a-a023fe66022b","Type":"ContainerDied","Data":"d347f254c5b56be2df20e3ab7c44db38af0916fe51bb3b3fdb4eb248bd6176a1"} Mar 14 09:10:57 crc kubenswrapper[4956]: I0314 09:10:57.219708 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7ce6871-ad8a-4ad9-b804-f3931e0f60ae" path="/var/lib/kubelet/pods/d7ce6871-ad8a-4ad9-b804-f3931e0f60ae/volumes" Mar 14 09:10:58 crc kubenswrapper[4956]: I0314 09:10:58.054203 4956 generic.go:334] "Generic (PLEG): container finished" podID="37548b9f-0521-4b58-a42a-a023fe66022b" containerID="1d130af5a4e0e177367e51ae8e885f5089b3ed3dc5c02351edd4e3a9f47db1a2" exitCode=0 Mar 14 09:10:58 crc kubenswrapper[4956]: I0314 09:10:58.054291 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" event={"ID":"37548b9f-0521-4b58-a42a-a023fe66022b","Type":"ContainerDied","Data":"1d130af5a4e0e177367e51ae8e885f5089b3ed3dc5c02351edd4e3a9f47db1a2"} Mar 14 09:10:59 crc kubenswrapper[4956]: I0314 09:10:59.291131 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" Mar 14 09:10:59 crc kubenswrapper[4956]: I0314 09:10:59.408132 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37548b9f-0521-4b58-a42a-a023fe66022b-util\") pod \"37548b9f-0521-4b58-a42a-a023fe66022b\" (UID: \"37548b9f-0521-4b58-a42a-a023fe66022b\") " Mar 14 09:10:59 crc kubenswrapper[4956]: I0314 09:10:59.408230 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkfsh\" (UniqueName: \"kubernetes.io/projected/37548b9f-0521-4b58-a42a-a023fe66022b-kube-api-access-jkfsh\") pod \"37548b9f-0521-4b58-a42a-a023fe66022b\" (UID: \"37548b9f-0521-4b58-a42a-a023fe66022b\") " Mar 14 09:10:59 crc kubenswrapper[4956]: I0314 09:10:59.408294 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37548b9f-0521-4b58-a42a-a023fe66022b-bundle\") pod \"37548b9f-0521-4b58-a42a-a023fe66022b\" (UID: \"37548b9f-0521-4b58-a42a-a023fe66022b\") " Mar 14 09:10:59 crc kubenswrapper[4956]: I0314 09:10:59.409588 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37548b9f-0521-4b58-a42a-a023fe66022b-bundle" (OuterVolumeSpecName: "bundle") pod "37548b9f-0521-4b58-a42a-a023fe66022b" (UID: "37548b9f-0521-4b58-a42a-a023fe66022b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:10:59 crc kubenswrapper[4956]: I0314 09:10:59.415223 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37548b9f-0521-4b58-a42a-a023fe66022b-kube-api-access-jkfsh" (OuterVolumeSpecName: "kube-api-access-jkfsh") pod "37548b9f-0521-4b58-a42a-a023fe66022b" (UID: "37548b9f-0521-4b58-a42a-a023fe66022b"). InnerVolumeSpecName "kube-api-access-jkfsh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:10:59 crc kubenswrapper[4956]: I0314 09:10:59.421802 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37548b9f-0521-4b58-a42a-a023fe66022b-util" (OuterVolumeSpecName: "util") pod "37548b9f-0521-4b58-a42a-a023fe66022b" (UID: "37548b9f-0521-4b58-a42a-a023fe66022b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:10:59 crc kubenswrapper[4956]: I0314 09:10:59.509122 4956 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37548b9f-0521-4b58-a42a-a023fe66022b-util\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:59 crc kubenswrapper[4956]: I0314 09:10:59.509160 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkfsh\" (UniqueName: \"kubernetes.io/projected/37548b9f-0521-4b58-a42a-a023fe66022b-kube-api-access-jkfsh\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:59 crc kubenswrapper[4956]: I0314 09:10:59.509172 4956 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37548b9f-0521-4b58-a42a-a023fe66022b-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:00 crc kubenswrapper[4956]: I0314 09:11:00.068857 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" event={"ID":"37548b9f-0521-4b58-a42a-a023fe66022b","Type":"ContainerDied","Data":"3c6853712c0a5e9a9989dd63a2bfb1c1b3f3b02ec49d15efcd51e325c2f0d88f"} Mar 14 09:11:00 crc kubenswrapper[4956]: I0314 09:11:00.068905 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c6853712c0a5e9a9989dd63a2bfb1c1b3f3b02ec49d15efcd51e325c2f0d88f" Mar 14 09:11:00 crc kubenswrapper[4956]: I0314 09:11:00.068906 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.761228 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt"] Mar 14 09:11:08 crc kubenswrapper[4956]: E0314 09:11:08.762077 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7ce6871-ad8a-4ad9-b804-f3931e0f60ae" containerName="console" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.762096 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ce6871-ad8a-4ad9-b804-f3931e0f60ae" containerName="console" Mar 14 09:11:08 crc kubenswrapper[4956]: E0314 09:11:08.762122 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37548b9f-0521-4b58-a42a-a023fe66022b" containerName="pull" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.762132 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="37548b9f-0521-4b58-a42a-a023fe66022b" containerName="pull" Mar 14 09:11:08 crc kubenswrapper[4956]: E0314 09:11:08.762149 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37548b9f-0521-4b58-a42a-a023fe66022b" containerName="util" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.762156 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="37548b9f-0521-4b58-a42a-a023fe66022b" containerName="util" Mar 14 09:11:08 crc kubenswrapper[4956]: E0314 09:11:08.762170 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37548b9f-0521-4b58-a42a-a023fe66022b" containerName="extract" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.762177 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="37548b9f-0521-4b58-a42a-a023fe66022b" containerName="extract" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.762322 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="37548b9f-0521-4b58-a42a-a023fe66022b" 
containerName="extract" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.762341 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7ce6871-ad8a-4ad9-b804-f3931e0f60ae" containerName="console" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.763093 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.765814 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.765835 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-vswvv" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.767182 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.767229 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.767272 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.779509 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt"] Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.859080 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/955961ad-ed3c-4fca-9da0-d9f44684fee8-apiservice-cert\") pod \"metallb-operator-controller-manager-6ccbfd57fd-wcxtt\" (UID: \"955961ad-ed3c-4fca-9da0-d9f44684fee8\") " pod="metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt" Mar 
14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.859152 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/955961ad-ed3c-4fca-9da0-d9f44684fee8-webhook-cert\") pod \"metallb-operator-controller-manager-6ccbfd57fd-wcxtt\" (UID: \"955961ad-ed3c-4fca-9da0-d9f44684fee8\") " pod="metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.859226 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd7nj\" (UniqueName: \"kubernetes.io/projected/955961ad-ed3c-4fca-9da0-d9f44684fee8-kube-api-access-xd7nj\") pod \"metallb-operator-controller-manager-6ccbfd57fd-wcxtt\" (UID: \"955961ad-ed3c-4fca-9da0-d9f44684fee8\") " pod="metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.960693 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/955961ad-ed3c-4fca-9da0-d9f44684fee8-apiservice-cert\") pod \"metallb-operator-controller-manager-6ccbfd57fd-wcxtt\" (UID: \"955961ad-ed3c-4fca-9da0-d9f44684fee8\") " pod="metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.960765 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/955961ad-ed3c-4fca-9da0-d9f44684fee8-webhook-cert\") pod \"metallb-operator-controller-manager-6ccbfd57fd-wcxtt\" (UID: \"955961ad-ed3c-4fca-9da0-d9f44684fee8\") " pod="metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.960807 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd7nj\" (UniqueName: 
\"kubernetes.io/projected/955961ad-ed3c-4fca-9da0-d9f44684fee8-kube-api-access-xd7nj\") pod \"metallb-operator-controller-manager-6ccbfd57fd-wcxtt\" (UID: \"955961ad-ed3c-4fca-9da0-d9f44684fee8\") " pod="metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.966846 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/955961ad-ed3c-4fca-9da0-d9f44684fee8-webhook-cert\") pod \"metallb-operator-controller-manager-6ccbfd57fd-wcxtt\" (UID: \"955961ad-ed3c-4fca-9da0-d9f44684fee8\") " pod="metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.966856 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/955961ad-ed3c-4fca-9da0-d9f44684fee8-apiservice-cert\") pod \"metallb-operator-controller-manager-6ccbfd57fd-wcxtt\" (UID: \"955961ad-ed3c-4fca-9da0-d9f44684fee8\") " pod="metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt" Mar 14 09:11:08 crc kubenswrapper[4956]: I0314 09:11:08.989137 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd7nj\" (UniqueName: \"kubernetes.io/projected/955961ad-ed3c-4fca-9da0-d9f44684fee8-kube-api-access-xd7nj\") pod \"metallb-operator-controller-manager-6ccbfd57fd-wcxtt\" (UID: \"955961ad-ed3c-4fca-9da0-d9f44684fee8\") " pod="metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt" Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.092580 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt" Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.280523 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-886f6977-6gxlb"] Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.281346 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-886f6977-6gxlb" Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.285274 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.285578 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-74k7t" Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.285681 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.298295 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-886f6977-6gxlb"] Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.371155 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6ead3e16-6e7d-4598-993b-17d3fce555f3-webhook-cert\") pod \"metallb-operator-webhook-server-886f6977-6gxlb\" (UID: \"6ead3e16-6e7d-4598-993b-17d3fce555f3\") " pod="metallb-system/metallb-operator-webhook-server-886f6977-6gxlb" Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.371251 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7qzb\" (UniqueName: \"kubernetes.io/projected/6ead3e16-6e7d-4598-993b-17d3fce555f3-kube-api-access-r7qzb\") pod 
\"metallb-operator-webhook-server-886f6977-6gxlb\" (UID: \"6ead3e16-6e7d-4598-993b-17d3fce555f3\") " pod="metallb-system/metallb-operator-webhook-server-886f6977-6gxlb" Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.371310 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6ead3e16-6e7d-4598-993b-17d3fce555f3-apiservice-cert\") pod \"metallb-operator-webhook-server-886f6977-6gxlb\" (UID: \"6ead3e16-6e7d-4598-993b-17d3fce555f3\") " pod="metallb-system/metallb-operator-webhook-server-886f6977-6gxlb" Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.473447 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6ead3e16-6e7d-4598-993b-17d3fce555f3-apiservice-cert\") pod \"metallb-operator-webhook-server-886f6977-6gxlb\" (UID: \"6ead3e16-6e7d-4598-993b-17d3fce555f3\") " pod="metallb-system/metallb-operator-webhook-server-886f6977-6gxlb" Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.473702 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6ead3e16-6e7d-4598-993b-17d3fce555f3-webhook-cert\") pod \"metallb-operator-webhook-server-886f6977-6gxlb\" (UID: \"6ead3e16-6e7d-4598-993b-17d3fce555f3\") " pod="metallb-system/metallb-operator-webhook-server-886f6977-6gxlb" Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.473767 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7qzb\" (UniqueName: \"kubernetes.io/projected/6ead3e16-6e7d-4598-993b-17d3fce555f3-kube-api-access-r7qzb\") pod \"metallb-operator-webhook-server-886f6977-6gxlb\" (UID: \"6ead3e16-6e7d-4598-993b-17d3fce555f3\") " pod="metallb-system/metallb-operator-webhook-server-886f6977-6gxlb" Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.483222 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6ead3e16-6e7d-4598-993b-17d3fce555f3-webhook-cert\") pod \"metallb-operator-webhook-server-886f6977-6gxlb\" (UID: \"6ead3e16-6e7d-4598-993b-17d3fce555f3\") " pod="metallb-system/metallb-operator-webhook-server-886f6977-6gxlb" Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.484527 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6ead3e16-6e7d-4598-993b-17d3fce555f3-apiservice-cert\") pod \"metallb-operator-webhook-server-886f6977-6gxlb\" (UID: \"6ead3e16-6e7d-4598-993b-17d3fce555f3\") " pod="metallb-system/metallb-operator-webhook-server-886f6977-6gxlb" Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.518653 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7qzb\" (UniqueName: \"kubernetes.io/projected/6ead3e16-6e7d-4598-993b-17d3fce555f3-kube-api-access-r7qzb\") pod \"metallb-operator-webhook-server-886f6977-6gxlb\" (UID: \"6ead3e16-6e7d-4598-993b-17d3fce555f3\") " pod="metallb-system/metallb-operator-webhook-server-886f6977-6gxlb" Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.618924 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt"] Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.619104 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-886f6977-6gxlb" Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.630827 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:11:09 crc kubenswrapper[4956]: I0314 09:11:09.887865 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-886f6977-6gxlb"] Mar 14 09:11:10 crc kubenswrapper[4956]: I0314 09:11:10.132715 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-886f6977-6gxlb" event={"ID":"6ead3e16-6e7d-4598-993b-17d3fce555f3","Type":"ContainerStarted","Data":"765dbd608a0dce17440b661163f245781f4cb3f5bef8d59e82100db79404f0d2"} Mar 14 09:11:10 crc kubenswrapper[4956]: I0314 09:11:10.134105 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt" event={"ID":"955961ad-ed3c-4fca-9da0-d9f44684fee8","Type":"ContainerStarted","Data":"a42e51261c343fda49608e329a4b633911bd953bfaaec081507ff21751cd3179"} Mar 14 09:11:16 crc kubenswrapper[4956]: I0314 09:11:16.183864 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt" event={"ID":"955961ad-ed3c-4fca-9da0-d9f44684fee8","Type":"ContainerStarted","Data":"abc53fd98551906f4e06f8ff1970753142e1ac79708889fddf028d36b3ae7f79"} Mar 14 09:11:16 crc kubenswrapper[4956]: I0314 09:11:16.184428 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt" Mar 14 09:11:16 crc kubenswrapper[4956]: I0314 09:11:16.211826 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt" podStartSLOduration=4.417244183 podStartE2EDuration="8.211808871s" podCreationTimestamp="2026-03-14 09:11:08 
+0000 UTC" firstStartedPulling="2026-03-14 09:11:09.63057076 +0000 UTC m=+875.143263028" lastFinishedPulling="2026-03-14 09:11:13.425135448 +0000 UTC m=+878.937827716" observedRunningTime="2026-03-14 09:11:16.208222951 +0000 UTC m=+881.720915229" watchObservedRunningTime="2026-03-14 09:11:16.211808871 +0000 UTC m=+881.724501129" Mar 14 09:11:17 crc kubenswrapper[4956]: I0314 09:11:17.191522 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-886f6977-6gxlb" event={"ID":"6ead3e16-6e7d-4598-993b-17d3fce555f3","Type":"ContainerStarted","Data":"47a593970e8e4f232b89ad3afbbbcbe23073dcedf0e6cd9628aecb6daaa0663e"} Mar 14 09:11:17 crc kubenswrapper[4956]: I0314 09:11:17.191876 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-886f6977-6gxlb" Mar 14 09:11:17 crc kubenswrapper[4956]: I0314 09:11:17.208671 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-886f6977-6gxlb" podStartSLOduration=1.279213213 podStartE2EDuration="8.208652461s" podCreationTimestamp="2026-03-14 09:11:09 +0000 UTC" firstStartedPulling="2026-03-14 09:11:09.897003957 +0000 UTC m=+875.409696225" lastFinishedPulling="2026-03-14 09:11:16.826443205 +0000 UTC m=+882.339135473" observedRunningTime="2026-03-14 09:11:17.207545763 +0000 UTC m=+882.720238031" watchObservedRunningTime="2026-03-14 09:11:17.208652461 +0000 UTC m=+882.721344729" Mar 14 09:11:29 crc kubenswrapper[4956]: I0314 09:11:29.623060 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-886f6977-6gxlb" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.095591 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6ccbfd57fd-wcxtt" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.849463 4956 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-s58l8"] Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.852252 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.854743 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.854905 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-fw9bn" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.854750 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.868102 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-nd87q"] Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.869046 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nd87q" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.871261 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.892607 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-nd87q"] Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.895210 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw4zz\" (UniqueName: \"kubernetes.io/projected/1f771f81-34d4-4b3a-b2f9-1791c32f81fa-kube-api-access-zw4zz\") pod \"frr-k8s-webhook-server-bcc4b6f68-nd87q\" (UID: \"1f771f81-34d4-4b3a-b2f9-1791c32f81fa\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nd87q" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.895347 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f771f81-34d4-4b3a-b2f9-1791c32f81fa-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-nd87q\" (UID: \"1f771f81-34d4-4b3a-b2f9-1791c32f81fa\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nd87q" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.962615 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-npbfj"] Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.963545 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-npbfj" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.965923 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-jpt4f" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.967230 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.967370 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.967633 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.971991 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-pmz25"] Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.973054 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-pmz25" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.975678 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.979968 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-pmz25"] Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.996973 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a9d18aeb-d0f0-4312-945e-2eae3025fd59-reloader\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.997141 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn8tg\" (UniqueName: \"kubernetes.io/projected/a9d18aeb-d0f0-4312-945e-2eae3025fd59-kube-api-access-fn8tg\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.997267 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a9d18aeb-d0f0-4312-945e-2eae3025fd59-metrics\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.997369 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a9d18aeb-d0f0-4312-945e-2eae3025fd59-frr-sockets\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.997450 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9d18aeb-d0f0-4312-945e-2eae3025fd59-metrics-certs\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.997527 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a9d18aeb-d0f0-4312-945e-2eae3025fd59-frr-startup\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.997561 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f771f81-34d4-4b3a-b2f9-1791c32f81fa-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-nd87q\" (UID: \"1f771f81-34d4-4b3a-b2f9-1791c32f81fa\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nd87q" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.997843 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a9d18aeb-d0f0-4312-945e-2eae3025fd59-frr-conf\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:49 crc kubenswrapper[4956]: I0314 09:11:49.998013 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw4zz\" (UniqueName: \"kubernetes.io/projected/1f771f81-34d4-4b3a-b2f9-1791c32f81fa-kube-api-access-zw4zz\") pod \"frr-k8s-webhook-server-bcc4b6f68-nd87q\" (UID: \"1f771f81-34d4-4b3a-b2f9-1791c32f81fa\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nd87q" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.015219 4956 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f771f81-34d4-4b3a-b2f9-1791c32f81fa-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-nd87q\" (UID: \"1f771f81-34d4-4b3a-b2f9-1791c32f81fa\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nd87q" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.017502 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw4zz\" (UniqueName: \"kubernetes.io/projected/1f771f81-34d4-4b3a-b2f9-1791c32f81fa-kube-api-access-zw4zz\") pod \"frr-k8s-webhook-server-bcc4b6f68-nd87q\" (UID: \"1f771f81-34d4-4b3a-b2f9-1791c32f81fa\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nd87q" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.098491 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-metallb-excludel2\") pod \"speaker-npbfj\" (UID: \"ce1b2f0a-e73a-4105-ba00-f91d243f6fd9\") " pod="metallb-system/speaker-npbfj" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.098559 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9d18aeb-d0f0-4312-945e-2eae3025fd59-metrics-certs\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.098583 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0e2705-3765-4603-97ec-4ff1f5a2bf73-metrics-certs\") pod \"controller-7bb4cc7c98-pmz25\" (UID: \"da0e2705-3765-4603-97ec-4ff1f5a2bf73\") " pod="metallb-system/controller-7bb4cc7c98-pmz25" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.098601 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a9d18aeb-d0f0-4312-945e-2eae3025fd59-frr-startup\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.098640 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-metrics-certs\") pod \"speaker-npbfj\" (UID: \"ce1b2f0a-e73a-4105-ba00-f91d243f6fd9\") " pod="metallb-system/speaker-npbfj" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.098659 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a9d18aeb-d0f0-4312-945e-2eae3025fd59-frr-conf\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.098677 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da0e2705-3765-4603-97ec-4ff1f5a2bf73-cert\") pod \"controller-7bb4cc7c98-pmz25\" (UID: \"da0e2705-3765-4603-97ec-4ff1f5a2bf73\") " pod="metallb-system/controller-7bb4cc7c98-pmz25" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.098715 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-memberlist\") pod \"speaker-npbfj\" (UID: \"ce1b2f0a-e73a-4105-ba00-f91d243f6fd9\") " pod="metallb-system/speaker-npbfj" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.098742 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a9d18aeb-d0f0-4312-945e-2eae3025fd59-reloader\") pod \"frr-k8s-s58l8\" (UID: 
\"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.098762 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv7wj\" (UniqueName: \"kubernetes.io/projected/da0e2705-3765-4603-97ec-4ff1f5a2bf73-kube-api-access-fv7wj\") pod \"controller-7bb4cc7c98-pmz25\" (UID: \"da0e2705-3765-4603-97ec-4ff1f5a2bf73\") " pod="metallb-system/controller-7bb4cc7c98-pmz25" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.098786 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn8tg\" (UniqueName: \"kubernetes.io/projected/a9d18aeb-d0f0-4312-945e-2eae3025fd59-kube-api-access-fn8tg\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.098809 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6qwc\" (UniqueName: \"kubernetes.io/projected/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-kube-api-access-g6qwc\") pod \"speaker-npbfj\" (UID: \"ce1b2f0a-e73a-4105-ba00-f91d243f6fd9\") " pod="metallb-system/speaker-npbfj" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.098829 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a9d18aeb-d0f0-4312-945e-2eae3025fd59-metrics\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.098849 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a9d18aeb-d0f0-4312-945e-2eae3025fd59-frr-sockets\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 
09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.099215 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a9d18aeb-d0f0-4312-945e-2eae3025fd59-frr-sockets\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.099412 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a9d18aeb-d0f0-4312-945e-2eae3025fd59-reloader\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.099423 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a9d18aeb-d0f0-4312-945e-2eae3025fd59-metrics\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.099716 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a9d18aeb-d0f0-4312-945e-2eae3025fd59-frr-conf\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.099809 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a9d18aeb-d0f0-4312-945e-2eae3025fd59-frr-startup\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.101569 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9d18aeb-d0f0-4312-945e-2eae3025fd59-metrics-certs\") pod \"frr-k8s-s58l8\" (UID: 
\"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.116276 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn8tg\" (UniqueName: \"kubernetes.io/projected/a9d18aeb-d0f0-4312-945e-2eae3025fd59-kube-api-access-fn8tg\") pod \"frr-k8s-s58l8\" (UID: \"a9d18aeb-d0f0-4312-945e-2eae3025fd59\") " pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.171691 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-s58l8" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.199800 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da0e2705-3765-4603-97ec-4ff1f5a2bf73-cert\") pod \"controller-7bb4cc7c98-pmz25\" (UID: \"da0e2705-3765-4603-97ec-4ff1f5a2bf73\") " pod="metallb-system/controller-7bb4cc7c98-pmz25" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.200068 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-memberlist\") pod \"speaker-npbfj\" (UID: \"ce1b2f0a-e73a-4105-ba00-f91d243f6fd9\") " pod="metallb-system/speaker-npbfj" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.200172 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv7wj\" (UniqueName: \"kubernetes.io/projected/da0e2705-3765-4603-97ec-4ff1f5a2bf73-kube-api-access-fv7wj\") pod \"controller-7bb4cc7c98-pmz25\" (UID: \"da0e2705-3765-4603-97ec-4ff1f5a2bf73\") " pod="metallb-system/controller-7bb4cc7c98-pmz25" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.200265 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6qwc\" (UniqueName: 
\"kubernetes.io/projected/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-kube-api-access-g6qwc\") pod \"speaker-npbfj\" (UID: \"ce1b2f0a-e73a-4105-ba00-f91d243f6fd9\") " pod="metallb-system/speaker-npbfj" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.200359 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-metallb-excludel2\") pod \"speaker-npbfj\" (UID: \"ce1b2f0a-e73a-4105-ba00-f91d243f6fd9\") " pod="metallb-system/speaker-npbfj" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.200436 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0e2705-3765-4603-97ec-4ff1f5a2bf73-metrics-certs\") pod \"controller-7bb4cc7c98-pmz25\" (UID: \"da0e2705-3765-4603-97ec-4ff1f5a2bf73\") " pod="metallb-system/controller-7bb4cc7c98-pmz25" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.200521 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-metrics-certs\") pod \"speaker-npbfj\" (UID: \"ce1b2f0a-e73a-4105-ba00-f91d243f6fd9\") " pod="metallb-system/speaker-npbfj" Mar 14 09:11:50 crc kubenswrapper[4956]: E0314 09:11:50.200761 4956 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 14 09:11:50 crc kubenswrapper[4956]: E0314 09:11:50.200905 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-memberlist podName:ce1b2f0a-e73a-4105-ba00-f91d243f6fd9 nodeName:}" failed. No retries permitted until 2026-03-14 09:11:50.700835274 +0000 UTC m=+916.213527542 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-memberlist") pod "speaker-npbfj" (UID: "ce1b2f0a-e73a-4105-ba00-f91d243f6fd9") : secret "metallb-memberlist" not found Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.201534 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-metallb-excludel2\") pod \"speaker-npbfj\" (UID: \"ce1b2f0a-e73a-4105-ba00-f91d243f6fd9\") " pod="metallb-system/speaker-npbfj" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.201775 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.204674 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-metrics-certs\") pod \"speaker-npbfj\" (UID: \"ce1b2f0a-e73a-4105-ba00-f91d243f6fd9\") " pod="metallb-system/speaker-npbfj" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.209642 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nd87q" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.210373 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0e2705-3765-4603-97ec-4ff1f5a2bf73-metrics-certs\") pod \"controller-7bb4cc7c98-pmz25\" (UID: \"da0e2705-3765-4603-97ec-4ff1f5a2bf73\") " pod="metallb-system/controller-7bb4cc7c98-pmz25" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.214056 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da0e2705-3765-4603-97ec-4ff1f5a2bf73-cert\") pod \"controller-7bb4cc7c98-pmz25\" (UID: \"da0e2705-3765-4603-97ec-4ff1f5a2bf73\") " pod="metallb-system/controller-7bb4cc7c98-pmz25" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.216958 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv7wj\" (UniqueName: \"kubernetes.io/projected/da0e2705-3765-4603-97ec-4ff1f5a2bf73-kube-api-access-fv7wj\") pod \"controller-7bb4cc7c98-pmz25\" (UID: \"da0e2705-3765-4603-97ec-4ff1f5a2bf73\") " pod="metallb-system/controller-7bb4cc7c98-pmz25" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.217597 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6qwc\" (UniqueName: \"kubernetes.io/projected/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-kube-api-access-g6qwc\") pod \"speaker-npbfj\" (UID: \"ce1b2f0a-e73a-4105-ba00-f91d243f6fd9\") " pod="metallb-system/speaker-npbfj" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.292795 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-pmz25" Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.635042 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-nd87q"] Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.712903 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-memberlist\") pod \"speaker-npbfj\" (UID: \"ce1b2f0a-e73a-4105-ba00-f91d243f6fd9\") " pod="metallb-system/speaker-npbfj" Mar 14 09:11:50 crc kubenswrapper[4956]: E0314 09:11:50.713144 4956 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 14 09:11:50 crc kubenswrapper[4956]: E0314 09:11:50.713263 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-memberlist podName:ce1b2f0a-e73a-4105-ba00-f91d243f6fd9 nodeName:}" failed. No retries permitted until 2026-03-14 09:11:51.713235447 +0000 UTC m=+917.225927755 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-memberlist") pod "speaker-npbfj" (UID: "ce1b2f0a-e73a-4105-ba00-f91d243f6fd9") : secret "metallb-memberlist" not found Mar 14 09:11:50 crc kubenswrapper[4956]: I0314 09:11:50.728382 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-pmz25"] Mar 14 09:11:50 crc kubenswrapper[4956]: W0314 09:11:50.728965 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda0e2705_3765_4603_97ec_4ff1f5a2bf73.slice/crio-86897b85ca127c078b026f4d2e8553e3026173a6e04f2b56d2a1a504e299c3db WatchSource:0}: Error finding container 86897b85ca127c078b026f4d2e8553e3026173a6e04f2b56d2a1a504e299c3db: Status 404 returned error can't find the container with id 86897b85ca127c078b026f4d2e8553e3026173a6e04f2b56d2a1a504e299c3db Mar 14 09:11:51 crc kubenswrapper[4956]: I0314 09:11:51.397691 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-pmz25" event={"ID":"da0e2705-3765-4603-97ec-4ff1f5a2bf73","Type":"ContainerStarted","Data":"56ce90d2f9928c4c644a81d8b0085ce7cbe984d355f2c987202a2bf685d44508"} Mar 14 09:11:51 crc kubenswrapper[4956]: I0314 09:11:51.398033 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-pmz25" event={"ID":"da0e2705-3765-4603-97ec-4ff1f5a2bf73","Type":"ContainerStarted","Data":"cd377c51e15936f187f339dd9fb0dcf5cc32fc9394daea0e7735724a47b8cbb3"} Mar 14 09:11:51 crc kubenswrapper[4956]: I0314 09:11:51.398051 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-pmz25" event={"ID":"da0e2705-3765-4603-97ec-4ff1f5a2bf73","Type":"ContainerStarted","Data":"86897b85ca127c078b026f4d2e8553e3026173a6e04f2b56d2a1a504e299c3db"} Mar 14 09:11:51 crc kubenswrapper[4956]: I0314 09:11:51.398110 4956 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-pmz25" Mar 14 09:11:51 crc kubenswrapper[4956]: I0314 09:11:51.400424 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-s58l8" event={"ID":"a9d18aeb-d0f0-4312-945e-2eae3025fd59","Type":"ContainerStarted","Data":"383d64f7e8a816fadfef2c6f691a8b39260658483ae025d6241951d38865be9a"} Mar 14 09:11:51 crc kubenswrapper[4956]: I0314 09:11:51.402041 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nd87q" event={"ID":"1f771f81-34d4-4b3a-b2f9-1791c32f81fa","Type":"ContainerStarted","Data":"7c9affade47b0f39238d38cfa9beb15284141ef8a56d35d75835cac9b547d2e2"} Mar 14 09:11:51 crc kubenswrapper[4956]: I0314 09:11:51.418941 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-pmz25" podStartSLOduration=2.41892185 podStartE2EDuration="2.41892185s" podCreationTimestamp="2026-03-14 09:11:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:11:51.417346211 +0000 UTC m=+916.930038479" watchObservedRunningTime="2026-03-14 09:11:51.41892185 +0000 UTC m=+916.931614118" Mar 14 09:11:51 crc kubenswrapper[4956]: I0314 09:11:51.727052 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-memberlist\") pod \"speaker-npbfj\" (UID: \"ce1b2f0a-e73a-4105-ba00-f91d243f6fd9\") " pod="metallb-system/speaker-npbfj" Mar 14 09:11:51 crc kubenswrapper[4956]: I0314 09:11:51.734829 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce1b2f0a-e73a-4105-ba00-f91d243f6fd9-memberlist\") pod \"speaker-npbfj\" (UID: \"ce1b2f0a-e73a-4105-ba00-f91d243f6fd9\") " pod="metallb-system/speaker-npbfj" 
Mar 14 09:11:51 crc kubenswrapper[4956]: I0314 09:11:51.783768 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-npbfj" Mar 14 09:11:51 crc kubenswrapper[4956]: W0314 09:11:51.811047 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce1b2f0a_e73a_4105_ba00_f91d243f6fd9.slice/crio-e62d5505c5c77d600ca8e9384c38b1c1641593669d8649948c803064882e67de WatchSource:0}: Error finding container e62d5505c5c77d600ca8e9384c38b1c1641593669d8649948c803064882e67de: Status 404 returned error can't find the container with id e62d5505c5c77d600ca8e9384c38b1c1641593669d8649948c803064882e67de Mar 14 09:11:52 crc kubenswrapper[4956]: I0314 09:11:52.410440 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-npbfj" event={"ID":"ce1b2f0a-e73a-4105-ba00-f91d243f6fd9","Type":"ContainerStarted","Data":"1e7aa5896239e3d55dd587dc40d4e81e9155b3488539cacc1bb80913c52abccf"} Mar 14 09:11:52 crc kubenswrapper[4956]: I0314 09:11:52.410755 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-npbfj" event={"ID":"ce1b2f0a-e73a-4105-ba00-f91d243f6fd9","Type":"ContainerStarted","Data":"fcdb08768aa62c9999f0b190cfd9cc3f7b429ea46a8290a36280f7036a1f6789"} Mar 14 09:11:52 crc kubenswrapper[4956]: I0314 09:11:52.410765 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-npbfj" event={"ID":"ce1b2f0a-e73a-4105-ba00-f91d243f6fd9","Type":"ContainerStarted","Data":"e62d5505c5c77d600ca8e9384c38b1c1641593669d8649948c803064882e67de"} Mar 14 09:11:52 crc kubenswrapper[4956]: I0314 09:11:52.411890 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-npbfj" Mar 14 09:11:52 crc kubenswrapper[4956]: I0314 09:11:52.432136 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-npbfj" podStartSLOduration=3.432118105 
podStartE2EDuration="3.432118105s" podCreationTimestamp="2026-03-14 09:11:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:11:52.427289025 +0000 UTC m=+917.939981293" watchObservedRunningTime="2026-03-14 09:11:52.432118105 +0000 UTC m=+917.944810373" Mar 14 09:11:59 crc kubenswrapper[4956]: I0314 09:11:59.475056 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nd87q" event={"ID":"1f771f81-34d4-4b3a-b2f9-1791c32f81fa","Type":"ContainerStarted","Data":"f346f338b261a35cb6ab98a790485ef3d3ed8e3e26286f9f17b0684932c15b69"} Mar 14 09:11:59 crc kubenswrapper[4956]: I0314 09:11:59.475649 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nd87q" Mar 14 09:11:59 crc kubenswrapper[4956]: I0314 09:11:59.478084 4956 generic.go:334] "Generic (PLEG): container finished" podID="a9d18aeb-d0f0-4312-945e-2eae3025fd59" containerID="c927e9c2c631d8f8c7193c4b43e6e50fe162313d7f476f1cbb5770206e44f5dd" exitCode=0 Mar 14 09:11:59 crc kubenswrapper[4956]: I0314 09:11:59.478130 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-s58l8" event={"ID":"a9d18aeb-d0f0-4312-945e-2eae3025fd59","Type":"ContainerDied","Data":"c927e9c2c631d8f8c7193c4b43e6e50fe162313d7f476f1cbb5770206e44f5dd"} Mar 14 09:11:59 crc kubenswrapper[4956]: I0314 09:11:59.520337 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nd87q" podStartSLOduration=2.246104829 podStartE2EDuration="10.520316727s" podCreationTimestamp="2026-03-14 09:11:49 +0000 UTC" firstStartedPulling="2026-03-14 09:11:50.642876976 +0000 UTC m=+916.155569244" lastFinishedPulling="2026-03-14 09:11:58.917088874 +0000 UTC m=+924.429781142" observedRunningTime="2026-03-14 09:11:59.495676083 +0000 UTC m=+925.008368371" 
watchObservedRunningTime="2026-03-14 09:11:59.520316727 +0000 UTC m=+925.033008995" Mar 14 09:12:00 crc kubenswrapper[4956]: I0314 09:12:00.124668 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557992-vcc4s"] Mar 14 09:12:00 crc kubenswrapper[4956]: I0314 09:12:00.125867 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557992-vcc4s" Mar 14 09:12:00 crc kubenswrapper[4956]: I0314 09:12:00.128137 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:12:00 crc kubenswrapper[4956]: I0314 09:12:00.129311 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:12:00 crc kubenswrapper[4956]: I0314 09:12:00.131529 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:12:00 crc kubenswrapper[4956]: I0314 09:12:00.134936 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557992-vcc4s"] Mar 14 09:12:00 crc kubenswrapper[4956]: I0314 09:12:00.146509 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxwxg\" (UniqueName: \"kubernetes.io/projected/9b3704b7-7a24-482e-b277-c4312f42a29d-kube-api-access-kxwxg\") pod \"auto-csr-approver-29557992-vcc4s\" (UID: \"9b3704b7-7a24-482e-b277-c4312f42a29d\") " pod="openshift-infra/auto-csr-approver-29557992-vcc4s" Mar 14 09:12:00 crc kubenswrapper[4956]: I0314 09:12:00.247332 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxwxg\" (UniqueName: \"kubernetes.io/projected/9b3704b7-7a24-482e-b277-c4312f42a29d-kube-api-access-kxwxg\") pod \"auto-csr-approver-29557992-vcc4s\" (UID: \"9b3704b7-7a24-482e-b277-c4312f42a29d\") " 
pod="openshift-infra/auto-csr-approver-29557992-vcc4s" Mar 14 09:12:00 crc kubenswrapper[4956]: I0314 09:12:00.266965 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxwxg\" (UniqueName: \"kubernetes.io/projected/9b3704b7-7a24-482e-b277-c4312f42a29d-kube-api-access-kxwxg\") pod \"auto-csr-approver-29557992-vcc4s\" (UID: \"9b3704b7-7a24-482e-b277-c4312f42a29d\") " pod="openshift-infra/auto-csr-approver-29557992-vcc4s" Mar 14 09:12:00 crc kubenswrapper[4956]: I0314 09:12:00.296439 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-pmz25" Mar 14 09:12:00 crc kubenswrapper[4956]: I0314 09:12:00.444895 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557992-vcc4s" Mar 14 09:12:00 crc kubenswrapper[4956]: I0314 09:12:00.494470 4956 generic.go:334] "Generic (PLEG): container finished" podID="a9d18aeb-d0f0-4312-945e-2eae3025fd59" containerID="4163013ca6110c7b175de37556442fb3689adf6d97d7091c4f95d95dea5d1e4a" exitCode=0 Mar 14 09:12:00 crc kubenswrapper[4956]: I0314 09:12:00.494588 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-s58l8" event={"ID":"a9d18aeb-d0f0-4312-945e-2eae3025fd59","Type":"ContainerDied","Data":"4163013ca6110c7b175de37556442fb3689adf6d97d7091c4f95d95dea5d1e4a"} Mar 14 09:12:00 crc kubenswrapper[4956]: I0314 09:12:00.859676 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557992-vcc4s"] Mar 14 09:12:00 crc kubenswrapper[4956]: W0314 09:12:00.866363 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b3704b7_7a24_482e_b277_c4312f42a29d.slice/crio-3673b69104ca9eb426da06f2951a5619bb761c00f958bfae0b36a6f179701437 WatchSource:0}: Error finding container 3673b69104ca9eb426da06f2951a5619bb761c00f958bfae0b36a6f179701437: Status 
404 returned error can't find the container with id 3673b69104ca9eb426da06f2951a5619bb761c00f958bfae0b36a6f179701437 Mar 14 09:12:01 crc kubenswrapper[4956]: I0314 09:12:01.501720 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557992-vcc4s" event={"ID":"9b3704b7-7a24-482e-b277-c4312f42a29d","Type":"ContainerStarted","Data":"3673b69104ca9eb426da06f2951a5619bb761c00f958bfae0b36a6f179701437"} Mar 14 09:12:01 crc kubenswrapper[4956]: I0314 09:12:01.503640 4956 generic.go:334] "Generic (PLEG): container finished" podID="a9d18aeb-d0f0-4312-945e-2eae3025fd59" containerID="e84f30039feb060e6da2bf5b491896f87f5e9dba8601254d0d3ee50e400b6aef" exitCode=0 Mar 14 09:12:01 crc kubenswrapper[4956]: I0314 09:12:01.503688 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-s58l8" event={"ID":"a9d18aeb-d0f0-4312-945e-2eae3025fd59","Type":"ContainerDied","Data":"e84f30039feb060e6da2bf5b491896f87f5e9dba8601254d0d3ee50e400b6aef"} Mar 14 09:12:02 crc kubenswrapper[4956]: I0314 09:12:02.513267 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-s58l8" event={"ID":"a9d18aeb-d0f0-4312-945e-2eae3025fd59","Type":"ContainerStarted","Data":"11db4c37100ef9fb70bec766ff264d80698d825dad68ce9310b309aeda4b28fd"} Mar 14 09:12:02 crc kubenswrapper[4956]: I0314 09:12:02.513605 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-s58l8" event={"ID":"a9d18aeb-d0f0-4312-945e-2eae3025fd59","Type":"ContainerStarted","Data":"9ea63d35420a55a6a7be01ab84be298d231dd8056c31e975fc7084afe0d5ec6d"} Mar 14 09:12:02 crc kubenswrapper[4956]: I0314 09:12:02.513614 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-s58l8" event={"ID":"a9d18aeb-d0f0-4312-945e-2eae3025fd59","Type":"ContainerStarted","Data":"2e8ce83182636b04de986859723e77de212b7e95b68ee6edb277545c7de9c5b0"} Mar 14 09:12:02 crc kubenswrapper[4956]: I0314 09:12:02.516100 4956 generic.go:334] 
"Generic (PLEG): container finished" podID="9b3704b7-7a24-482e-b277-c4312f42a29d" containerID="9635f7225b64ac8dfdd57c9a5fb12f2e4c04e99bf58daba2f2b4bbdfa0826df0" exitCode=0 Mar 14 09:12:02 crc kubenswrapper[4956]: I0314 09:12:02.516137 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557992-vcc4s" event={"ID":"9b3704b7-7a24-482e-b277-c4312f42a29d","Type":"ContainerDied","Data":"9635f7225b64ac8dfdd57c9a5fb12f2e4c04e99bf58daba2f2b4bbdfa0826df0"} Mar 14 09:12:03 crc kubenswrapper[4956]: I0314 09:12:03.527045 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-s58l8" event={"ID":"a9d18aeb-d0f0-4312-945e-2eae3025fd59","Type":"ContainerStarted","Data":"137da96f645464674d8769de9f00c8403ea7112e7aa19839fdba9285f72fba33"} Mar 14 09:12:03 crc kubenswrapper[4956]: I0314 09:12:03.527348 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-s58l8" event={"ID":"a9d18aeb-d0f0-4312-945e-2eae3025fd59","Type":"ContainerStarted","Data":"2d679d952b926870ec5dfe1c581e5bc894ab89f3308cba9c4dd8b2693503a305"} Mar 14 09:12:03 crc kubenswrapper[4956]: I0314 09:12:03.528113 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-s58l8" Mar 14 09:12:03 crc kubenswrapper[4956]: I0314 09:12:03.528152 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-s58l8" event={"ID":"a9d18aeb-d0f0-4312-945e-2eae3025fd59","Type":"ContainerStarted","Data":"a8df0a2eda5bdae3021ec09f33d0eb293179f8f663e9102cef79235776df9b64"} Mar 14 09:12:03 crc kubenswrapper[4956]: I0314 09:12:03.555864 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-s58l8" podStartSLOduration=6.18424644 podStartE2EDuration="14.555841153s" podCreationTimestamp="2026-03-14 09:11:49 +0000 UTC" firstStartedPulling="2026-03-14 09:11:50.560458874 +0000 UTC m=+916.073151132" lastFinishedPulling="2026-03-14 09:11:58.932053577 
+0000 UTC m=+924.444745845" observedRunningTime="2026-03-14 09:12:03.5517264 +0000 UTC m=+929.064418668" watchObservedRunningTime="2026-03-14 09:12:03.555841153 +0000 UTC m=+929.068533431" Mar 14 09:12:03 crc kubenswrapper[4956]: I0314 09:12:03.789764 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557992-vcc4s" Mar 14 09:12:03 crc kubenswrapper[4956]: I0314 09:12:03.891702 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxwxg\" (UniqueName: \"kubernetes.io/projected/9b3704b7-7a24-482e-b277-c4312f42a29d-kube-api-access-kxwxg\") pod \"9b3704b7-7a24-482e-b277-c4312f42a29d\" (UID: \"9b3704b7-7a24-482e-b277-c4312f42a29d\") " Mar 14 09:12:03 crc kubenswrapper[4956]: I0314 09:12:03.896737 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b3704b7-7a24-482e-b277-c4312f42a29d-kube-api-access-kxwxg" (OuterVolumeSpecName: "kube-api-access-kxwxg") pod "9b3704b7-7a24-482e-b277-c4312f42a29d" (UID: "9b3704b7-7a24-482e-b277-c4312f42a29d"). InnerVolumeSpecName "kube-api-access-kxwxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:12:03 crc kubenswrapper[4956]: I0314 09:12:03.993648 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxwxg\" (UniqueName: \"kubernetes.io/projected/9b3704b7-7a24-482e-b277-c4312f42a29d-kube-api-access-kxwxg\") on node \"crc\" DevicePath \"\"" Mar 14 09:12:04 crc kubenswrapper[4956]: I0314 09:12:04.535101 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557992-vcc4s" event={"ID":"9b3704b7-7a24-482e-b277-c4312f42a29d","Type":"ContainerDied","Data":"3673b69104ca9eb426da06f2951a5619bb761c00f958bfae0b36a6f179701437"} Mar 14 09:12:04 crc kubenswrapper[4956]: I0314 09:12:04.535135 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557992-vcc4s" Mar 14 09:12:04 crc kubenswrapper[4956]: I0314 09:12:04.535148 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3673b69104ca9eb426da06f2951a5619bb761c00f958bfae0b36a6f179701437" Mar 14 09:12:04 crc kubenswrapper[4956]: I0314 09:12:04.834518 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557986-xg5j9"] Mar 14 09:12:04 crc kubenswrapper[4956]: I0314 09:12:04.838219 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557986-xg5j9"] Mar 14 09:12:05 crc kubenswrapper[4956]: I0314 09:12:05.172294 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-s58l8" Mar 14 09:12:05 crc kubenswrapper[4956]: I0314 09:12:05.231259 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03f526a2-e2d3-4894-9088-69073b2af2fd" path="/var/lib/kubelet/pods/03f526a2-e2d3-4894-9088-69073b2af2fd/volumes" Mar 14 09:12:05 crc kubenswrapper[4956]: I0314 09:12:05.232143 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-s58l8" Mar 14 09:12:10 crc kubenswrapper[4956]: I0314 09:12:10.213626 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nd87q" Mar 14 09:12:11 crc kubenswrapper[4956]: I0314 09:12:11.788708 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-npbfj" Mar 14 09:12:13 crc kubenswrapper[4956]: I0314 09:12:13.141498 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9"] Mar 14 09:12:13 crc kubenswrapper[4956]: E0314 09:12:13.142089 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3704b7-7a24-482e-b277-c4312f42a29d" 
containerName="oc" Mar 14 09:12:13 crc kubenswrapper[4956]: I0314 09:12:13.142106 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3704b7-7a24-482e-b277-c4312f42a29d" containerName="oc" Mar 14 09:12:13 crc kubenswrapper[4956]: I0314 09:12:13.142223 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b3704b7-7a24-482e-b277-c4312f42a29d" containerName="oc" Mar 14 09:12:13 crc kubenswrapper[4956]: I0314 09:12:13.143173 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" Mar 14 09:12:13 crc kubenswrapper[4956]: I0314 09:12:13.144935 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 09:12:13 crc kubenswrapper[4956]: I0314 09:12:13.151909 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9"] Mar 14 09:12:13 crc kubenswrapper[4956]: I0314 09:12:13.312533 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a710d73a-4449-44aa-b6e0-198adf069d08-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9\" (UID: \"a710d73a-4449-44aa-b6e0-198adf069d08\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" Mar 14 09:12:13 crc kubenswrapper[4956]: I0314 09:12:13.312594 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w88fz\" (UniqueName: \"kubernetes.io/projected/a710d73a-4449-44aa-b6e0-198adf069d08-kube-api-access-w88fz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9\" (UID: \"a710d73a-4449-44aa-b6e0-198adf069d08\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" Mar 14 09:12:13 
crc kubenswrapper[4956]: I0314 09:12:13.312689 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a710d73a-4449-44aa-b6e0-198adf069d08-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9\" (UID: \"a710d73a-4449-44aa-b6e0-198adf069d08\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" Mar 14 09:12:13 crc kubenswrapper[4956]: I0314 09:12:13.414103 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a710d73a-4449-44aa-b6e0-198adf069d08-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9\" (UID: \"a710d73a-4449-44aa-b6e0-198adf069d08\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" Mar 14 09:12:13 crc kubenswrapper[4956]: I0314 09:12:13.414189 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a710d73a-4449-44aa-b6e0-198adf069d08-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9\" (UID: \"a710d73a-4449-44aa-b6e0-198adf069d08\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" Mar 14 09:12:13 crc kubenswrapper[4956]: I0314 09:12:13.414232 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w88fz\" (UniqueName: \"kubernetes.io/projected/a710d73a-4449-44aa-b6e0-198adf069d08-kube-api-access-w88fz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9\" (UID: \"a710d73a-4449-44aa-b6e0-198adf069d08\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" Mar 14 09:12:13 crc kubenswrapper[4956]: I0314 09:12:13.414764 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/a710d73a-4449-44aa-b6e0-198adf069d08-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9\" (UID: \"a710d73a-4449-44aa-b6e0-198adf069d08\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" Mar 14 09:12:13 crc kubenswrapper[4956]: I0314 09:12:13.415032 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a710d73a-4449-44aa-b6e0-198adf069d08-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9\" (UID: \"a710d73a-4449-44aa-b6e0-198adf069d08\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" Mar 14 09:12:13 crc kubenswrapper[4956]: I0314 09:12:13.433380 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w88fz\" (UniqueName: \"kubernetes.io/projected/a710d73a-4449-44aa-b6e0-198adf069d08-kube-api-access-w88fz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9\" (UID: \"a710d73a-4449-44aa-b6e0-198adf069d08\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" Mar 14 09:12:13 crc kubenswrapper[4956]: I0314 09:12:13.469933 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" Mar 14 09:12:13 crc kubenswrapper[4956]: I0314 09:12:13.942833 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9"] Mar 14 09:12:14 crc kubenswrapper[4956]: I0314 09:12:14.594730 4956 generic.go:334] "Generic (PLEG): container finished" podID="a710d73a-4449-44aa-b6e0-198adf069d08" containerID="30e227f28ca53464dc1fb6a7b6a073c166b417212f49d35c049bb5ba6158fd76" exitCode=0 Mar 14 09:12:14 crc kubenswrapper[4956]: I0314 09:12:14.594772 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" event={"ID":"a710d73a-4449-44aa-b6e0-198adf069d08","Type":"ContainerDied","Data":"30e227f28ca53464dc1fb6a7b6a073c166b417212f49d35c049bb5ba6158fd76"} Mar 14 09:12:14 crc kubenswrapper[4956]: I0314 09:12:14.595003 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" event={"ID":"a710d73a-4449-44aa-b6e0-198adf069d08","Type":"ContainerStarted","Data":"21268fb24b1142bb1422bbb86bfcd4022a0e14c473ad3c31a4c0c2b2990b2121"} Mar 14 09:12:18 crc kubenswrapper[4956]: I0314 09:12:18.627215 4956 generic.go:334] "Generic (PLEG): container finished" podID="a710d73a-4449-44aa-b6e0-198adf069d08" containerID="f342691c0d49ac34ed6ac0a922805571cd6777abbd886ba6e91d888c7717205a" exitCode=0 Mar 14 09:12:18 crc kubenswrapper[4956]: I0314 09:12:18.627298 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" event={"ID":"a710d73a-4449-44aa-b6e0-198adf069d08","Type":"ContainerDied","Data":"f342691c0d49ac34ed6ac0a922805571cd6777abbd886ba6e91d888c7717205a"} Mar 14 09:12:19 crc kubenswrapper[4956]: I0314 09:12:19.634921 4956 
generic.go:334] "Generic (PLEG): container finished" podID="a710d73a-4449-44aa-b6e0-198adf069d08" containerID="d1fdcd543c51d59ff4033c152424363e8f1544547b59f8f831f441d32e70a902" exitCode=0 Mar 14 09:12:19 crc kubenswrapper[4956]: I0314 09:12:19.635023 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" event={"ID":"a710d73a-4449-44aa-b6e0-198adf069d08","Type":"ContainerDied","Data":"d1fdcd543c51d59ff4033c152424363e8f1544547b59f8f831f441d32e70a902"} Mar 14 09:12:20 crc kubenswrapper[4956]: I0314 09:12:20.177665 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-s58l8" Mar 14 09:12:20 crc kubenswrapper[4956]: I0314 09:12:20.932675 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" Mar 14 09:12:21 crc kubenswrapper[4956]: I0314 09:12:21.125646 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a710d73a-4449-44aa-b6e0-198adf069d08-util\") pod \"a710d73a-4449-44aa-b6e0-198adf069d08\" (UID: \"a710d73a-4449-44aa-b6e0-198adf069d08\") " Mar 14 09:12:21 crc kubenswrapper[4956]: I0314 09:12:21.125702 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w88fz\" (UniqueName: \"kubernetes.io/projected/a710d73a-4449-44aa-b6e0-198adf069d08-kube-api-access-w88fz\") pod \"a710d73a-4449-44aa-b6e0-198adf069d08\" (UID: \"a710d73a-4449-44aa-b6e0-198adf069d08\") " Mar 14 09:12:21 crc kubenswrapper[4956]: I0314 09:12:21.125736 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a710d73a-4449-44aa-b6e0-198adf069d08-bundle\") pod \"a710d73a-4449-44aa-b6e0-198adf069d08\" (UID: \"a710d73a-4449-44aa-b6e0-198adf069d08\") " Mar 
14 09:12:21 crc kubenswrapper[4956]: I0314 09:12:21.126936 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a710d73a-4449-44aa-b6e0-198adf069d08-bundle" (OuterVolumeSpecName: "bundle") pod "a710d73a-4449-44aa-b6e0-198adf069d08" (UID: "a710d73a-4449-44aa-b6e0-198adf069d08"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:12:21 crc kubenswrapper[4956]: I0314 09:12:21.132305 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a710d73a-4449-44aa-b6e0-198adf069d08-kube-api-access-w88fz" (OuterVolumeSpecName: "kube-api-access-w88fz") pod "a710d73a-4449-44aa-b6e0-198adf069d08" (UID: "a710d73a-4449-44aa-b6e0-198adf069d08"). InnerVolumeSpecName "kube-api-access-w88fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:12:21 crc kubenswrapper[4956]: I0314 09:12:21.137665 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a710d73a-4449-44aa-b6e0-198adf069d08-util" (OuterVolumeSpecName: "util") pod "a710d73a-4449-44aa-b6e0-198adf069d08" (UID: "a710d73a-4449-44aa-b6e0-198adf069d08"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:12:21 crc kubenswrapper[4956]: I0314 09:12:21.227255 4956 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a710d73a-4449-44aa-b6e0-198adf069d08-util\") on node \"crc\" DevicePath \"\"" Mar 14 09:12:21 crc kubenswrapper[4956]: I0314 09:12:21.227288 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w88fz\" (UniqueName: \"kubernetes.io/projected/a710d73a-4449-44aa-b6e0-198adf069d08-kube-api-access-w88fz\") on node \"crc\" DevicePath \"\"" Mar 14 09:12:21 crc kubenswrapper[4956]: I0314 09:12:21.227304 4956 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a710d73a-4449-44aa-b6e0-198adf069d08-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:12:21 crc kubenswrapper[4956]: I0314 09:12:21.650510 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" event={"ID":"a710d73a-4449-44aa-b6e0-198adf069d08","Type":"ContainerDied","Data":"21268fb24b1142bb1422bbb86bfcd4022a0e14c473ad3c31a4c0c2b2990b2121"} Mar 14 09:12:21 crc kubenswrapper[4956]: I0314 09:12:21.650553 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9" Mar 14 09:12:21 crc kubenswrapper[4956]: I0314 09:12:21.650568 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21268fb24b1142bb1422bbb86bfcd4022a0e14c473ad3c31a4c0c2b2990b2121" Mar 14 09:12:26 crc kubenswrapper[4956]: I0314 09:12:26.489632 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2lhwc"] Mar 14 09:12:26 crc kubenswrapper[4956]: E0314 09:12:26.490460 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a710d73a-4449-44aa-b6e0-198adf069d08" containerName="extract" Mar 14 09:12:26 crc kubenswrapper[4956]: I0314 09:12:26.490491 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a710d73a-4449-44aa-b6e0-198adf069d08" containerName="extract" Mar 14 09:12:26 crc kubenswrapper[4956]: E0314 09:12:26.490511 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a710d73a-4449-44aa-b6e0-198adf069d08" containerName="pull" Mar 14 09:12:26 crc kubenswrapper[4956]: I0314 09:12:26.490519 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a710d73a-4449-44aa-b6e0-198adf069d08" containerName="pull" Mar 14 09:12:26 crc kubenswrapper[4956]: E0314 09:12:26.490529 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a710d73a-4449-44aa-b6e0-198adf069d08" containerName="util" Mar 14 09:12:26 crc kubenswrapper[4956]: I0314 09:12:26.490536 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a710d73a-4449-44aa-b6e0-198adf069d08" containerName="util" Mar 14 09:12:26 crc kubenswrapper[4956]: I0314 09:12:26.490669 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="a710d73a-4449-44aa-b6e0-198adf069d08" containerName="extract" Mar 14 09:12:26 crc kubenswrapper[4956]: I0314 09:12:26.491197 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2lhwc" Mar 14 09:12:26 crc kubenswrapper[4956]: I0314 09:12:26.498733 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 14 09:12:26 crc kubenswrapper[4956]: I0314 09:12:26.499422 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 14 09:12:26 crc kubenswrapper[4956]: I0314 09:12:26.499466 4956 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-vgv5z" Mar 14 09:12:26 crc kubenswrapper[4956]: I0314 09:12:26.518696 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2lhwc"] Mar 14 09:12:26 crc kubenswrapper[4956]: I0314 09:12:26.596933 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc8d2a29-78e2-4bc0-9efc-5da547e43e0b-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-2lhwc\" (UID: \"cc8d2a29-78e2-4bc0-9efc-5da547e43e0b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2lhwc" Mar 14 09:12:26 crc kubenswrapper[4956]: I0314 09:12:26.597314 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vbw6\" (UniqueName: \"kubernetes.io/projected/cc8d2a29-78e2-4bc0-9efc-5da547e43e0b-kube-api-access-9vbw6\") pod \"cert-manager-operator-controller-manager-66c8bdd694-2lhwc\" (UID: \"cc8d2a29-78e2-4bc0-9efc-5da547e43e0b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2lhwc" Mar 14 09:12:26 crc kubenswrapper[4956]: I0314 09:12:26.698733 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9vbw6\" (UniqueName: \"kubernetes.io/projected/cc8d2a29-78e2-4bc0-9efc-5da547e43e0b-kube-api-access-9vbw6\") pod \"cert-manager-operator-controller-manager-66c8bdd694-2lhwc\" (UID: \"cc8d2a29-78e2-4bc0-9efc-5da547e43e0b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2lhwc" Mar 14 09:12:26 crc kubenswrapper[4956]: I0314 09:12:26.698876 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc8d2a29-78e2-4bc0-9efc-5da547e43e0b-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-2lhwc\" (UID: \"cc8d2a29-78e2-4bc0-9efc-5da547e43e0b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2lhwc" Mar 14 09:12:26 crc kubenswrapper[4956]: I0314 09:12:26.699415 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc8d2a29-78e2-4bc0-9efc-5da547e43e0b-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-2lhwc\" (UID: \"cc8d2a29-78e2-4bc0-9efc-5da547e43e0b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2lhwc" Mar 14 09:12:26 crc kubenswrapper[4956]: I0314 09:12:26.721343 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vbw6\" (UniqueName: \"kubernetes.io/projected/cc8d2a29-78e2-4bc0-9efc-5da547e43e0b-kube-api-access-9vbw6\") pod \"cert-manager-operator-controller-manager-66c8bdd694-2lhwc\" (UID: \"cc8d2a29-78e2-4bc0-9efc-5da547e43e0b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2lhwc" Mar 14 09:12:26 crc kubenswrapper[4956]: I0314 09:12:26.810215 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2lhwc" Mar 14 09:12:27 crc kubenswrapper[4956]: I0314 09:12:27.260285 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2lhwc"] Mar 14 09:12:27 crc kubenswrapper[4956]: I0314 09:12:27.690764 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2lhwc" event={"ID":"cc8d2a29-78e2-4bc0-9efc-5da547e43e0b","Type":"ContainerStarted","Data":"974011515dfafac3628735ffbb2e1f620d0300776c6e3b4407ff27aa89a77eae"} Mar 14 09:12:31 crc kubenswrapper[4956]: I0314 09:12:31.733820 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2lhwc" event={"ID":"cc8d2a29-78e2-4bc0-9efc-5da547e43e0b","Type":"ContainerStarted","Data":"048761746b00fa94261182719485e1f966a46b1e6e2ddec01ef0ce73ec40d047"} Mar 14 09:12:31 crc kubenswrapper[4956]: I0314 09:12:31.762151 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2lhwc" podStartSLOduration=2.235350584 podStartE2EDuration="5.762126608s" podCreationTimestamp="2026-03-14 09:12:26 +0000 UTC" firstStartedPulling="2026-03-14 09:12:27.267014385 +0000 UTC m=+952.779706653" lastFinishedPulling="2026-03-14 09:12:30.793790409 +0000 UTC m=+956.306482677" observedRunningTime="2026-03-14 09:12:31.756391526 +0000 UTC m=+957.269083804" watchObservedRunningTime="2026-03-14 09:12:31.762126608 +0000 UTC m=+957.274818876" Mar 14 09:12:33 crc kubenswrapper[4956]: I0314 09:12:33.900414 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2cc2c"] Mar 14 09:12:33 crc kubenswrapper[4956]: I0314 09:12:33.902262 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2cc2c" Mar 14 09:12:33 crc kubenswrapper[4956]: I0314 09:12:33.909241 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2cc2c"] Mar 14 09:12:34 crc kubenswrapper[4956]: I0314 09:12:34.007074 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d890a74-2b52-43f3-91bb-8d1730a21770-utilities\") pod \"certified-operators-2cc2c\" (UID: \"0d890a74-2b52-43f3-91bb-8d1730a21770\") " pod="openshift-marketplace/certified-operators-2cc2c" Mar 14 09:12:34 crc kubenswrapper[4956]: I0314 09:12:34.007164 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d890a74-2b52-43f3-91bb-8d1730a21770-catalog-content\") pod \"certified-operators-2cc2c\" (UID: \"0d890a74-2b52-43f3-91bb-8d1730a21770\") " pod="openshift-marketplace/certified-operators-2cc2c" Mar 14 09:12:34 crc kubenswrapper[4956]: I0314 09:12:34.007217 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2hr4\" (UniqueName: \"kubernetes.io/projected/0d890a74-2b52-43f3-91bb-8d1730a21770-kube-api-access-s2hr4\") pod \"certified-operators-2cc2c\" (UID: \"0d890a74-2b52-43f3-91bb-8d1730a21770\") " pod="openshift-marketplace/certified-operators-2cc2c" Mar 14 09:12:34 crc kubenswrapper[4956]: I0314 09:12:34.108901 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d890a74-2b52-43f3-91bb-8d1730a21770-catalog-content\") pod \"certified-operators-2cc2c\" (UID: \"0d890a74-2b52-43f3-91bb-8d1730a21770\") " pod="openshift-marketplace/certified-operators-2cc2c" Mar 14 09:12:34 crc kubenswrapper[4956]: I0314 09:12:34.108960 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2hr4\" (UniqueName: \"kubernetes.io/projected/0d890a74-2b52-43f3-91bb-8d1730a21770-kube-api-access-s2hr4\") pod \"certified-operators-2cc2c\" (UID: \"0d890a74-2b52-43f3-91bb-8d1730a21770\") " pod="openshift-marketplace/certified-operators-2cc2c" Mar 14 09:12:34 crc kubenswrapper[4956]: I0314 09:12:34.109028 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d890a74-2b52-43f3-91bb-8d1730a21770-utilities\") pod \"certified-operators-2cc2c\" (UID: \"0d890a74-2b52-43f3-91bb-8d1730a21770\") " pod="openshift-marketplace/certified-operators-2cc2c" Mar 14 09:12:34 crc kubenswrapper[4956]: I0314 09:12:34.109433 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d890a74-2b52-43f3-91bb-8d1730a21770-utilities\") pod \"certified-operators-2cc2c\" (UID: \"0d890a74-2b52-43f3-91bb-8d1730a21770\") " pod="openshift-marketplace/certified-operators-2cc2c" Mar 14 09:12:34 crc kubenswrapper[4956]: I0314 09:12:34.109541 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d890a74-2b52-43f3-91bb-8d1730a21770-catalog-content\") pod \"certified-operators-2cc2c\" (UID: \"0d890a74-2b52-43f3-91bb-8d1730a21770\") " pod="openshift-marketplace/certified-operators-2cc2c" Mar 14 09:12:34 crc kubenswrapper[4956]: I0314 09:12:34.145796 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2hr4\" (UniqueName: \"kubernetes.io/projected/0d890a74-2b52-43f3-91bb-8d1730a21770-kube-api-access-s2hr4\") pod \"certified-operators-2cc2c\" (UID: \"0d890a74-2b52-43f3-91bb-8d1730a21770\") " pod="openshift-marketplace/certified-operators-2cc2c" Mar 14 09:12:34 crc kubenswrapper[4956]: I0314 09:12:34.222292 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2cc2c" Mar 14 09:12:34 crc kubenswrapper[4956]: I0314 09:12:34.677196 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2cc2c"] Mar 14 09:12:34 crc kubenswrapper[4956]: I0314 09:12:34.757030 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cc2c" event={"ID":"0d890a74-2b52-43f3-91bb-8d1730a21770","Type":"ContainerStarted","Data":"5d453b9efa9b599bf2de5b5dbb6dbee332e26be5615f1433aa6f0b395ee6fad6"} Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.000291 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-hfb75"] Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.001226 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-hfb75" Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.003465 4956 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-wnthn" Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.003503 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.003465 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.019725 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/470dcdf7-6d52-4164-bcfe-c8922d387147-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-hfb75\" (UID: \"470dcdf7-6d52-4164-bcfe-c8922d387147\") " pod="cert-manager/cert-manager-webhook-6888856db4-hfb75" Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.019874 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjhk4\" (UniqueName: \"kubernetes.io/projected/470dcdf7-6d52-4164-bcfe-c8922d387147-kube-api-access-kjhk4\") pod \"cert-manager-webhook-6888856db4-hfb75\" (UID: \"470dcdf7-6d52-4164-bcfe-c8922d387147\") " pod="cert-manager/cert-manager-webhook-6888856db4-hfb75" Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.024732 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-hfb75"] Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.120883 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjhk4\" (UniqueName: \"kubernetes.io/projected/470dcdf7-6d52-4164-bcfe-c8922d387147-kube-api-access-kjhk4\") pod \"cert-manager-webhook-6888856db4-hfb75\" (UID: \"470dcdf7-6d52-4164-bcfe-c8922d387147\") " pod="cert-manager/cert-manager-webhook-6888856db4-hfb75" Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.120999 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/470dcdf7-6d52-4164-bcfe-c8922d387147-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-hfb75\" (UID: \"470dcdf7-6d52-4164-bcfe-c8922d387147\") " pod="cert-manager/cert-manager-webhook-6888856db4-hfb75" Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.141508 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/470dcdf7-6d52-4164-bcfe-c8922d387147-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-hfb75\" (UID: \"470dcdf7-6d52-4164-bcfe-c8922d387147\") " pod="cert-manager/cert-manager-webhook-6888856db4-hfb75" Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.143367 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjhk4\" (UniqueName: 
\"kubernetes.io/projected/470dcdf7-6d52-4164-bcfe-c8922d387147-kube-api-access-kjhk4\") pod \"cert-manager-webhook-6888856db4-hfb75\" (UID: \"470dcdf7-6d52-4164-bcfe-c8922d387147\") " pod="cert-manager/cert-manager-webhook-6888856db4-hfb75" Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.318255 4956 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-wnthn" Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.326174 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-hfb75" Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.704026 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-hfb75"] Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.771741 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-hfb75" event={"ID":"470dcdf7-6d52-4164-bcfe-c8922d387147","Type":"ContainerStarted","Data":"bede78e3b2ed1af318fc4863383834ffc12224fc0cd9d833fe34e28d7951c187"} Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.772957 4956 generic.go:334] "Generic (PLEG): container finished" podID="0d890a74-2b52-43f3-91bb-8d1730a21770" containerID="fc1df94221d44525129f421b964108841e18f2081684894230866f0c8591d7b9" exitCode=0 Mar 14 09:12:35 crc kubenswrapper[4956]: I0314 09:12:35.772982 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cc2c" event={"ID":"0d890a74-2b52-43f3-91bb-8d1730a21770","Type":"ContainerDied","Data":"fc1df94221d44525129f421b964108841e18f2081684894230866f0c8591d7b9"} Mar 14 09:12:36 crc kubenswrapper[4956]: I0314 09:12:36.090257 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-lmlng"] Mar 14 09:12:36 crc kubenswrapper[4956]: I0314 09:12:36.091273 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-lmlng" Mar 14 09:12:36 crc kubenswrapper[4956]: I0314 09:12:36.098573 4956 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-tjbq7" Mar 14 09:12:36 crc kubenswrapper[4956]: I0314 09:12:36.109032 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-lmlng"] Mar 14 09:12:36 crc kubenswrapper[4956]: I0314 09:12:36.143958 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ab2a4c0-0a05-4ffa-9900-9af5dbe961df-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-lmlng\" (UID: \"2ab2a4c0-0a05-4ffa-9900-9af5dbe961df\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lmlng" Mar 14 09:12:36 crc kubenswrapper[4956]: I0314 09:12:36.144023 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twtkh\" (UniqueName: \"kubernetes.io/projected/2ab2a4c0-0a05-4ffa-9900-9af5dbe961df-kube-api-access-twtkh\") pod \"cert-manager-cainjector-5545bd876-lmlng\" (UID: \"2ab2a4c0-0a05-4ffa-9900-9af5dbe961df\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lmlng" Mar 14 09:12:36 crc kubenswrapper[4956]: I0314 09:12:36.244981 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twtkh\" (UniqueName: \"kubernetes.io/projected/2ab2a4c0-0a05-4ffa-9900-9af5dbe961df-kube-api-access-twtkh\") pod \"cert-manager-cainjector-5545bd876-lmlng\" (UID: \"2ab2a4c0-0a05-4ffa-9900-9af5dbe961df\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lmlng" Mar 14 09:12:36 crc kubenswrapper[4956]: I0314 09:12:36.245333 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/2ab2a4c0-0a05-4ffa-9900-9af5dbe961df-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-lmlng\" (UID: \"2ab2a4c0-0a05-4ffa-9900-9af5dbe961df\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lmlng" Mar 14 09:12:36 crc kubenswrapper[4956]: I0314 09:12:36.268596 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ab2a4c0-0a05-4ffa-9900-9af5dbe961df-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-lmlng\" (UID: \"2ab2a4c0-0a05-4ffa-9900-9af5dbe961df\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lmlng" Mar 14 09:12:36 crc kubenswrapper[4956]: I0314 09:12:36.268627 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twtkh\" (UniqueName: \"kubernetes.io/projected/2ab2a4c0-0a05-4ffa-9900-9af5dbe961df-kube-api-access-twtkh\") pod \"cert-manager-cainjector-5545bd876-lmlng\" (UID: \"2ab2a4c0-0a05-4ffa-9900-9af5dbe961df\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lmlng" Mar 14 09:12:36 crc kubenswrapper[4956]: I0314 09:12:36.435070 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-lmlng" Mar 14 09:12:36 crc kubenswrapper[4956]: I0314 09:12:36.716991 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-lmlng"] Mar 14 09:12:36 crc kubenswrapper[4956]: I0314 09:12:36.782382 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-lmlng" event={"ID":"2ab2a4c0-0a05-4ffa-9900-9af5dbe961df","Type":"ContainerStarted","Data":"52df994f78234ee13895688f247516a5d75292fcef4f70a377e9118472d3e179"} Mar 14 09:12:37 crc kubenswrapper[4956]: I0314 09:12:37.816617 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cc2c" event={"ID":"0d890a74-2b52-43f3-91bb-8d1730a21770","Type":"ContainerStarted","Data":"e254f8f52c03aecdab2668839a1a70a06897788e78e16617e0a802cc742789ed"} Mar 14 09:12:38 crc kubenswrapper[4956]: I0314 09:12:38.839553 4956 generic.go:334] "Generic (PLEG): container finished" podID="0d890a74-2b52-43f3-91bb-8d1730a21770" containerID="e254f8f52c03aecdab2668839a1a70a06897788e78e16617e0a802cc742789ed" exitCode=0 Mar 14 09:12:38 crc kubenswrapper[4956]: I0314 09:12:38.839691 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cc2c" event={"ID":"0d890a74-2b52-43f3-91bb-8d1730a21770","Type":"ContainerDied","Data":"e254f8f52c03aecdab2668839a1a70a06897788e78e16617e0a802cc742789ed"} Mar 14 09:12:40 crc kubenswrapper[4956]: I0314 09:12:40.304332 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jzrsq"] Mar 14 09:12:40 crc kubenswrapper[4956]: I0314 09:12:40.306340 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jzrsq" Mar 14 09:12:40 crc kubenswrapper[4956]: I0314 09:12:40.327310 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jzrsq"] Mar 14 09:12:40 crc kubenswrapper[4956]: I0314 09:12:40.414656 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfade054-fee3-4b75-9b0c-5b072fd47a61-utilities\") pod \"community-operators-jzrsq\" (UID: \"cfade054-fee3-4b75-9b0c-5b072fd47a61\") " pod="openshift-marketplace/community-operators-jzrsq" Mar 14 09:12:40 crc kubenswrapper[4956]: I0314 09:12:40.415112 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6rx8\" (UniqueName: \"kubernetes.io/projected/cfade054-fee3-4b75-9b0c-5b072fd47a61-kube-api-access-p6rx8\") pod \"community-operators-jzrsq\" (UID: \"cfade054-fee3-4b75-9b0c-5b072fd47a61\") " pod="openshift-marketplace/community-operators-jzrsq" Mar 14 09:12:40 crc kubenswrapper[4956]: I0314 09:12:40.415167 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfade054-fee3-4b75-9b0c-5b072fd47a61-catalog-content\") pod \"community-operators-jzrsq\" (UID: \"cfade054-fee3-4b75-9b0c-5b072fd47a61\") " pod="openshift-marketplace/community-operators-jzrsq" Mar 14 09:12:40 crc kubenswrapper[4956]: I0314 09:12:40.523255 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfade054-fee3-4b75-9b0c-5b072fd47a61-catalog-content\") pod \"community-operators-jzrsq\" (UID: \"cfade054-fee3-4b75-9b0c-5b072fd47a61\") " pod="openshift-marketplace/community-operators-jzrsq" Mar 14 09:12:40 crc kubenswrapper[4956]: I0314 09:12:40.523333 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfade054-fee3-4b75-9b0c-5b072fd47a61-utilities\") pod \"community-operators-jzrsq\" (UID: \"cfade054-fee3-4b75-9b0c-5b072fd47a61\") " pod="openshift-marketplace/community-operators-jzrsq" Mar 14 09:12:40 crc kubenswrapper[4956]: I0314 09:12:40.523430 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6rx8\" (UniqueName: \"kubernetes.io/projected/cfade054-fee3-4b75-9b0c-5b072fd47a61-kube-api-access-p6rx8\") pod \"community-operators-jzrsq\" (UID: \"cfade054-fee3-4b75-9b0c-5b072fd47a61\") " pod="openshift-marketplace/community-operators-jzrsq" Mar 14 09:12:40 crc kubenswrapper[4956]: I0314 09:12:40.523872 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfade054-fee3-4b75-9b0c-5b072fd47a61-catalog-content\") pod \"community-operators-jzrsq\" (UID: \"cfade054-fee3-4b75-9b0c-5b072fd47a61\") " pod="openshift-marketplace/community-operators-jzrsq" Mar 14 09:12:40 crc kubenswrapper[4956]: I0314 09:12:40.523884 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfade054-fee3-4b75-9b0c-5b072fd47a61-utilities\") pod \"community-operators-jzrsq\" (UID: \"cfade054-fee3-4b75-9b0c-5b072fd47a61\") " pod="openshift-marketplace/community-operators-jzrsq" Mar 14 09:12:40 crc kubenswrapper[4956]: I0314 09:12:40.552571 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6rx8\" (UniqueName: \"kubernetes.io/projected/cfade054-fee3-4b75-9b0c-5b072fd47a61-kube-api-access-p6rx8\") pod \"community-operators-jzrsq\" (UID: \"cfade054-fee3-4b75-9b0c-5b072fd47a61\") " pod="openshift-marketplace/community-operators-jzrsq" Mar 14 09:12:40 crc kubenswrapper[4956]: I0314 09:12:40.642534 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jzrsq" Mar 14 09:12:41 crc kubenswrapper[4956]: I0314 09:12:41.805407 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jzrsq"] Mar 14 09:12:41 crc kubenswrapper[4956]: W0314 09:12:41.821123 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfade054_fee3_4b75_9b0c_5b072fd47a61.slice/crio-bf6f1605f320dc4c284313fde62529036e3c7ca2eef7359a98ae9138c7809fed WatchSource:0}: Error finding container bf6f1605f320dc4c284313fde62529036e3c7ca2eef7359a98ae9138c7809fed: Status 404 returned error can't find the container with id bf6f1605f320dc4c284313fde62529036e3c7ca2eef7359a98ae9138c7809fed Mar 14 09:12:41 crc kubenswrapper[4956]: I0314 09:12:41.862347 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzrsq" event={"ID":"cfade054-fee3-4b75-9b0c-5b072fd47a61","Type":"ContainerStarted","Data":"bf6f1605f320dc4c284313fde62529036e3c7ca2eef7359a98ae9138c7809fed"} Mar 14 09:12:42 crc kubenswrapper[4956]: I0314 09:12:42.869768 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-lmlng" event={"ID":"2ab2a4c0-0a05-4ffa-9900-9af5dbe961df","Type":"ContainerStarted","Data":"d18b83e19600d1c58349de767fffc38051096cf634c3ffbfa08f2eefe05d2751"} Mar 14 09:12:42 crc kubenswrapper[4956]: I0314 09:12:42.872558 4956 generic.go:334] "Generic (PLEG): container finished" podID="cfade054-fee3-4b75-9b0c-5b072fd47a61" containerID="45ffc2ba153ee3e2db6d83bd7a16e21a4e4838b6e2e0faaead3af412487f4254" exitCode=0 Mar 14 09:12:42 crc kubenswrapper[4956]: I0314 09:12:42.872636 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzrsq" 
event={"ID":"cfade054-fee3-4b75-9b0c-5b072fd47a61","Type":"ContainerDied","Data":"45ffc2ba153ee3e2db6d83bd7a16e21a4e4838b6e2e0faaead3af412487f4254"} Mar 14 09:12:42 crc kubenswrapper[4956]: I0314 09:12:42.874548 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-hfb75" event={"ID":"470dcdf7-6d52-4164-bcfe-c8922d387147","Type":"ContainerStarted","Data":"65a6c97358fb29a191ab52a336119f9bbf26790f1ae90a63bbe793da2f3eefc8"} Mar 14 09:12:42 crc kubenswrapper[4956]: I0314 09:12:42.874648 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-hfb75" Mar 14 09:12:42 crc kubenswrapper[4956]: I0314 09:12:42.878968 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cc2c" event={"ID":"0d890a74-2b52-43f3-91bb-8d1730a21770","Type":"ContainerStarted","Data":"45058b6455179e6be1ad01abc824fe676b24f08178e43e0b97b13086e876838f"} Mar 14 09:12:42 crc kubenswrapper[4956]: I0314 09:12:42.889627 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-lmlng" podStartSLOduration=1.890137052 podStartE2EDuration="6.889604509s" podCreationTimestamp="2026-03-14 09:12:36 +0000 UTC" firstStartedPulling="2026-03-14 09:12:36.735703901 +0000 UTC m=+962.248396169" lastFinishedPulling="2026-03-14 09:12:41.735171358 +0000 UTC m=+967.247863626" observedRunningTime="2026-03-14 09:12:42.887200009 +0000 UTC m=+968.399892277" watchObservedRunningTime="2026-03-14 09:12:42.889604509 +0000 UTC m=+968.402296787" Mar 14 09:12:42 crc kubenswrapper[4956]: I0314 09:12:42.917818 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-hfb75" podStartSLOduration=2.9266012310000002 podStartE2EDuration="8.91779307s" podCreationTimestamp="2026-03-14 09:12:34 +0000 UTC" firstStartedPulling="2026-03-14 09:12:35.708105506 
+0000 UTC m=+961.220797774" lastFinishedPulling="2026-03-14 09:12:41.699297345 +0000 UTC m=+967.211989613" observedRunningTime="2026-03-14 09:12:42.914421816 +0000 UTC m=+968.427114074" watchObservedRunningTime="2026-03-14 09:12:42.91779307 +0000 UTC m=+968.430485338" Mar 14 09:12:42 crc kubenswrapper[4956]: I0314 09:12:42.956177 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2cc2c" podStartSLOduration=4.051976951 podStartE2EDuration="9.956156725s" podCreationTimestamp="2026-03-14 09:12:33 +0000 UTC" firstStartedPulling="2026-03-14 09:12:35.782445046 +0000 UTC m=+961.295137314" lastFinishedPulling="2026-03-14 09:12:41.68662482 +0000 UTC m=+967.199317088" observedRunningTime="2026-03-14 09:12:42.95071668 +0000 UTC m=+968.463408948" watchObservedRunningTime="2026-03-14 09:12:42.956156725 +0000 UTC m=+968.468848993" Mar 14 09:12:43 crc kubenswrapper[4956]: I0314 09:12:43.903000 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzrsq" event={"ID":"cfade054-fee3-4b75-9b0c-5b072fd47a61","Type":"ContainerStarted","Data":"aba869eab783e564e26aad703cc4f28d947525657d7b66b72565499eda75cc23"} Mar 14 09:12:44 crc kubenswrapper[4956]: I0314 09:12:44.223269 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2cc2c" Mar 14 09:12:44 crc kubenswrapper[4956]: I0314 09:12:44.223351 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2cc2c" Mar 14 09:12:44 crc kubenswrapper[4956]: I0314 09:12:44.314495 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2cc2c" Mar 14 09:12:44 crc kubenswrapper[4956]: I0314 09:12:44.911975 4956 generic.go:334] "Generic (PLEG): container finished" podID="cfade054-fee3-4b75-9b0c-5b072fd47a61" 
containerID="aba869eab783e564e26aad703cc4f28d947525657d7b66b72565499eda75cc23" exitCode=0 Mar 14 09:12:44 crc kubenswrapper[4956]: I0314 09:12:44.912078 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzrsq" event={"ID":"cfade054-fee3-4b75-9b0c-5b072fd47a61","Type":"ContainerDied","Data":"aba869eab783e564e26aad703cc4f28d947525657d7b66b72565499eda75cc23"} Mar 14 09:12:45 crc kubenswrapper[4956]: I0314 09:12:45.920002 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzrsq" event={"ID":"cfade054-fee3-4b75-9b0c-5b072fd47a61","Type":"ContainerStarted","Data":"27f2986f9ba2f5fdfaa8117c4ee647c9c1c39258449c5dbb233388e1cad56c8c"} Mar 14 09:12:45 crc kubenswrapper[4956]: I0314 09:12:45.934984 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jzrsq" podStartSLOduration=3.51330199 podStartE2EDuration="5.934964771s" podCreationTimestamp="2026-03-14 09:12:40 +0000 UTC" firstStartedPulling="2026-03-14 09:12:42.874569024 +0000 UTC m=+968.387261292" lastFinishedPulling="2026-03-14 09:12:45.296231775 +0000 UTC m=+970.808924073" observedRunningTime="2026-03-14 09:12:45.933411632 +0000 UTC m=+971.446103900" watchObservedRunningTime="2026-03-14 09:12:45.934964771 +0000 UTC m=+971.447657039" Mar 14 09:12:46 crc kubenswrapper[4956]: I0314 09:12:46.629521 4956 scope.go:117] "RemoveContainer" containerID="ebf340fa253d07b31a51fb0a4060c27fc5e0c80c1c4c4539b91403c6683bd953" Mar 14 09:12:48 crc kubenswrapper[4956]: I0314 09:12:48.703810 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r2qlr"] Mar 14 09:12:48 crc kubenswrapper[4956]: I0314 09:12:48.707401 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2qlr" Mar 14 09:12:48 crc kubenswrapper[4956]: I0314 09:12:48.715546 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2qlr"] Mar 14 09:12:48 crc kubenswrapper[4956]: I0314 09:12:48.732109 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf54v\" (UniqueName: \"kubernetes.io/projected/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-kube-api-access-xf54v\") pod \"redhat-marketplace-r2qlr\" (UID: \"6df6b026-36f0-4a23-8a8b-42d11e38dfdd\") " pod="openshift-marketplace/redhat-marketplace-r2qlr" Mar 14 09:12:48 crc kubenswrapper[4956]: I0314 09:12:48.732178 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-catalog-content\") pod \"redhat-marketplace-r2qlr\" (UID: \"6df6b026-36f0-4a23-8a8b-42d11e38dfdd\") " pod="openshift-marketplace/redhat-marketplace-r2qlr" Mar 14 09:12:48 crc kubenswrapper[4956]: I0314 09:12:48.732651 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-utilities\") pod \"redhat-marketplace-r2qlr\" (UID: \"6df6b026-36f0-4a23-8a8b-42d11e38dfdd\") " pod="openshift-marketplace/redhat-marketplace-r2qlr" Mar 14 09:12:48 crc kubenswrapper[4956]: I0314 09:12:48.834102 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf54v\" (UniqueName: \"kubernetes.io/projected/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-kube-api-access-xf54v\") pod \"redhat-marketplace-r2qlr\" (UID: \"6df6b026-36f0-4a23-8a8b-42d11e38dfdd\") " pod="openshift-marketplace/redhat-marketplace-r2qlr" Mar 14 09:12:48 crc kubenswrapper[4956]: I0314 09:12:48.834159 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-catalog-content\") pod \"redhat-marketplace-r2qlr\" (UID: \"6df6b026-36f0-4a23-8a8b-42d11e38dfdd\") " pod="openshift-marketplace/redhat-marketplace-r2qlr" Mar 14 09:12:48 crc kubenswrapper[4956]: I0314 09:12:48.834235 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-utilities\") pod \"redhat-marketplace-r2qlr\" (UID: \"6df6b026-36f0-4a23-8a8b-42d11e38dfdd\") " pod="openshift-marketplace/redhat-marketplace-r2qlr" Mar 14 09:12:48 crc kubenswrapper[4956]: I0314 09:12:48.834687 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-catalog-content\") pod \"redhat-marketplace-r2qlr\" (UID: \"6df6b026-36f0-4a23-8a8b-42d11e38dfdd\") " pod="openshift-marketplace/redhat-marketplace-r2qlr" Mar 14 09:12:48 crc kubenswrapper[4956]: I0314 09:12:48.834719 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-utilities\") pod \"redhat-marketplace-r2qlr\" (UID: \"6df6b026-36f0-4a23-8a8b-42d11e38dfdd\") " pod="openshift-marketplace/redhat-marketplace-r2qlr" Mar 14 09:12:48 crc kubenswrapper[4956]: I0314 09:12:48.856181 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf54v\" (UniqueName: \"kubernetes.io/projected/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-kube-api-access-xf54v\") pod \"redhat-marketplace-r2qlr\" (UID: \"6df6b026-36f0-4a23-8a8b-42d11e38dfdd\") " pod="openshift-marketplace/redhat-marketplace-r2qlr" Mar 14 09:12:49 crc kubenswrapper[4956]: I0314 09:12:49.036546 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2qlr" Mar 14 09:12:49 crc kubenswrapper[4956]: I0314 09:12:49.524666 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2qlr"] Mar 14 09:12:49 crc kubenswrapper[4956]: I0314 09:12:49.946776 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2qlr" event={"ID":"6df6b026-36f0-4a23-8a8b-42d11e38dfdd","Type":"ContainerStarted","Data":"c3b7d9b0c3ee6727a4e01c2d2b7234e283bfe22f376601583c3fd79b50611d9b"} Mar 14 09:12:50 crc kubenswrapper[4956]: I0314 09:12:50.329996 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-hfb75" Mar 14 09:12:50 crc kubenswrapper[4956]: I0314 09:12:50.643282 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jzrsq" Mar 14 09:12:50 crc kubenswrapper[4956]: I0314 09:12:50.643381 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jzrsq" Mar 14 09:12:50 crc kubenswrapper[4956]: I0314 09:12:50.681790 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jzrsq" Mar 14 09:12:50 crc kubenswrapper[4956]: I0314 09:12:50.954606 4956 generic.go:334] "Generic (PLEG): container finished" podID="6df6b026-36f0-4a23-8a8b-42d11e38dfdd" containerID="5624bc7bec07bcf5a204c968ec34dc34a1592fa513222ebb3013cd013212e8f7" exitCode=0 Mar 14 09:12:50 crc kubenswrapper[4956]: I0314 09:12:50.954706 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2qlr" event={"ID":"6df6b026-36f0-4a23-8a8b-42d11e38dfdd","Type":"ContainerDied","Data":"5624bc7bec07bcf5a204c968ec34dc34a1592fa513222ebb3013cd013212e8f7"} Mar 14 09:12:50 crc kubenswrapper[4956]: I0314 09:12:50.993960 4956 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jzrsq" Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.034825 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-jmg2k"] Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.035579 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-jmg2k" Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.037122 4956 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-krrxp" Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.047749 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-jmg2k"] Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.092857 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wchvf\" (UniqueName: \"kubernetes.io/projected/04227b43-3672-4848-84f8-275b2ec997d8-kube-api-access-wchvf\") pod \"cert-manager-545d4d4674-jmg2k\" (UID: \"04227b43-3672-4848-84f8-275b2ec997d8\") " pod="cert-manager/cert-manager-545d4d4674-jmg2k" Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.093406 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04227b43-3672-4848-84f8-275b2ec997d8-bound-sa-token\") pod \"cert-manager-545d4d4674-jmg2k\" (UID: \"04227b43-3672-4848-84f8-275b2ec997d8\") " pod="cert-manager/cert-manager-545d4d4674-jmg2k" Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.094943 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jzrsq"] Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.195430 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wchvf\" (UniqueName: 
\"kubernetes.io/projected/04227b43-3672-4848-84f8-275b2ec997d8-kube-api-access-wchvf\") pod \"cert-manager-545d4d4674-jmg2k\" (UID: \"04227b43-3672-4848-84f8-275b2ec997d8\") " pod="cert-manager/cert-manager-545d4d4674-jmg2k" Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.195651 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04227b43-3672-4848-84f8-275b2ec997d8-bound-sa-token\") pod \"cert-manager-545d4d4674-jmg2k\" (UID: \"04227b43-3672-4848-84f8-275b2ec997d8\") " pod="cert-manager/cert-manager-545d4d4674-jmg2k" Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.216864 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wchvf\" (UniqueName: \"kubernetes.io/projected/04227b43-3672-4848-84f8-275b2ec997d8-kube-api-access-wchvf\") pod \"cert-manager-545d4d4674-jmg2k\" (UID: \"04227b43-3672-4848-84f8-275b2ec997d8\") " pod="cert-manager/cert-manager-545d4d4674-jmg2k" Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.217379 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04227b43-3672-4848-84f8-275b2ec997d8-bound-sa-token\") pod \"cert-manager-545d4d4674-jmg2k\" (UID: \"04227b43-3672-4848-84f8-275b2ec997d8\") " pod="cert-manager/cert-manager-545d4d4674-jmg2k" Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.355152 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-jmg2k" Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.813971 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-jmg2k"] Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.991845 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-jmg2k" event={"ID":"04227b43-3672-4848-84f8-275b2ec997d8","Type":"ContainerStarted","Data":"ca04319faf858d6f27927448beac3f1152d59bdd098ff965eb63ae4e7b361581"} Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.991898 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-jmg2k" event={"ID":"04227b43-3672-4848-84f8-275b2ec997d8","Type":"ContainerStarted","Data":"efcdff4927dabe2449e7dc3ed64db8307c94175adffe3821d749058c364d7947"} Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.994186 4956 generic.go:334] "Generic (PLEG): container finished" podID="6df6b026-36f0-4a23-8a8b-42d11e38dfdd" containerID="0c96c869d0132616d3c9bb728b219764cd6bafcad3ae510c0b13b44007b2a2f8" exitCode=0 Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.994278 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2qlr" event={"ID":"6df6b026-36f0-4a23-8a8b-42d11e38dfdd","Type":"ContainerDied","Data":"0c96c869d0132616d3c9bb728b219764cd6bafcad3ae510c0b13b44007b2a2f8"} Mar 14 09:12:53 crc kubenswrapper[4956]: I0314 09:12:53.994764 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jzrsq" podUID="cfade054-fee3-4b75-9b0c-5b072fd47a61" containerName="registry-server" containerID="cri-o://27f2986f9ba2f5fdfaa8117c4ee647c9c1c39258449c5dbb233388e1cad56c8c" gracePeriod=2 Mar 14 09:12:54 crc kubenswrapper[4956]: I0314 09:12:54.021722 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-545d4d4674-jmg2k" podStartSLOduration=1.021690801 podStartE2EDuration="1.021690801s" podCreationTimestamp="2026-03-14 09:12:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:12:54.013465266 +0000 UTC m=+979.526157574" watchObservedRunningTime="2026-03-14 09:12:54.021690801 +0000 UTC m=+979.534383069" Mar 14 09:12:54 crc kubenswrapper[4956]: I0314 09:12:54.307109 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2cc2c" Mar 14 09:12:54 crc kubenswrapper[4956]: I0314 09:12:54.471188 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jzrsq" Mar 14 09:12:54 crc kubenswrapper[4956]: I0314 09:12:54.516202 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfade054-fee3-4b75-9b0c-5b072fd47a61-utilities\") pod \"cfade054-fee3-4b75-9b0c-5b072fd47a61\" (UID: \"cfade054-fee3-4b75-9b0c-5b072fd47a61\") " Mar 14 09:12:54 crc kubenswrapper[4956]: I0314 09:12:54.516334 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfade054-fee3-4b75-9b0c-5b072fd47a61-catalog-content\") pod \"cfade054-fee3-4b75-9b0c-5b072fd47a61\" (UID: \"cfade054-fee3-4b75-9b0c-5b072fd47a61\") " Mar 14 09:12:54 crc kubenswrapper[4956]: I0314 09:12:54.516376 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6rx8\" (UniqueName: \"kubernetes.io/projected/cfade054-fee3-4b75-9b0c-5b072fd47a61-kube-api-access-p6rx8\") pod \"cfade054-fee3-4b75-9b0c-5b072fd47a61\" (UID: \"cfade054-fee3-4b75-9b0c-5b072fd47a61\") " Mar 14 09:12:54 crc kubenswrapper[4956]: I0314 09:12:54.517316 4956 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfade054-fee3-4b75-9b0c-5b072fd47a61-utilities" (OuterVolumeSpecName: "utilities") pod "cfade054-fee3-4b75-9b0c-5b072fd47a61" (UID: "cfade054-fee3-4b75-9b0c-5b072fd47a61"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:12:54 crc kubenswrapper[4956]: I0314 09:12:54.523765 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfade054-fee3-4b75-9b0c-5b072fd47a61-kube-api-access-p6rx8" (OuterVolumeSpecName: "kube-api-access-p6rx8") pod "cfade054-fee3-4b75-9b0c-5b072fd47a61" (UID: "cfade054-fee3-4b75-9b0c-5b072fd47a61"). InnerVolumeSpecName "kube-api-access-p6rx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:12:54 crc kubenswrapper[4956]: I0314 09:12:54.572174 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfade054-fee3-4b75-9b0c-5b072fd47a61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfade054-fee3-4b75-9b0c-5b072fd47a61" (UID: "cfade054-fee3-4b75-9b0c-5b072fd47a61"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:12:54 crc kubenswrapper[4956]: I0314 09:12:54.617992 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfade054-fee3-4b75-9b0c-5b072fd47a61-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:12:54 crc kubenswrapper[4956]: I0314 09:12:54.618027 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6rx8\" (UniqueName: \"kubernetes.io/projected/cfade054-fee3-4b75-9b0c-5b072fd47a61-kube-api-access-p6rx8\") on node \"crc\" DevicePath \"\"" Mar 14 09:12:54 crc kubenswrapper[4956]: I0314 09:12:54.618038 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfade054-fee3-4b75-9b0c-5b072fd47a61-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.005149 4956 generic.go:334] "Generic (PLEG): container finished" podID="cfade054-fee3-4b75-9b0c-5b072fd47a61" containerID="27f2986f9ba2f5fdfaa8117c4ee647c9c1c39258449c5dbb233388e1cad56c8c" exitCode=0 Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.005229 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jzrsq" Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.005250 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzrsq" event={"ID":"cfade054-fee3-4b75-9b0c-5b072fd47a61","Type":"ContainerDied","Data":"27f2986f9ba2f5fdfaa8117c4ee647c9c1c39258449c5dbb233388e1cad56c8c"} Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.006110 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzrsq" event={"ID":"cfade054-fee3-4b75-9b0c-5b072fd47a61","Type":"ContainerDied","Data":"bf6f1605f320dc4c284313fde62529036e3c7ca2eef7359a98ae9138c7809fed"} Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.006141 4956 scope.go:117] "RemoveContainer" containerID="27f2986f9ba2f5fdfaa8117c4ee647c9c1c39258449c5dbb233388e1cad56c8c" Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.008876 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2qlr" event={"ID":"6df6b026-36f0-4a23-8a8b-42d11e38dfdd","Type":"ContainerStarted","Data":"2a49572530c82eaf6807aa177c05ec393dd374f045e884af1d5013572629c009"} Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.021739 4956 scope.go:117] "RemoveContainer" containerID="aba869eab783e564e26aad703cc4f28d947525657d7b66b72565499eda75cc23" Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.038330 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r2qlr" podStartSLOduration=3.498511347 podStartE2EDuration="7.038308292s" podCreationTimestamp="2026-03-14 09:12:48 +0000 UTC" firstStartedPulling="2026-03-14 09:12:50.956268492 +0000 UTC m=+976.468960760" lastFinishedPulling="2026-03-14 09:12:54.496065437 +0000 UTC m=+980.008757705" observedRunningTime="2026-03-14 09:12:55.033309158 +0000 UTC m=+980.546001446" watchObservedRunningTime="2026-03-14 
09:12:55.038308292 +0000 UTC m=+980.551000560" Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.059246 4956 scope.go:117] "RemoveContainer" containerID="45ffc2ba153ee3e2db6d83bd7a16e21a4e4838b6e2e0faaead3af412487f4254" Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.062608 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jzrsq"] Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.072545 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jzrsq"] Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.083132 4956 scope.go:117] "RemoveContainer" containerID="27f2986f9ba2f5fdfaa8117c4ee647c9c1c39258449c5dbb233388e1cad56c8c" Mar 14 09:12:55 crc kubenswrapper[4956]: E0314 09:12:55.083904 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27f2986f9ba2f5fdfaa8117c4ee647c9c1c39258449c5dbb233388e1cad56c8c\": container with ID starting with 27f2986f9ba2f5fdfaa8117c4ee647c9c1c39258449c5dbb233388e1cad56c8c not found: ID does not exist" containerID="27f2986f9ba2f5fdfaa8117c4ee647c9c1c39258449c5dbb233388e1cad56c8c" Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.083968 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f2986f9ba2f5fdfaa8117c4ee647c9c1c39258449c5dbb233388e1cad56c8c"} err="failed to get container status \"27f2986f9ba2f5fdfaa8117c4ee647c9c1c39258449c5dbb233388e1cad56c8c\": rpc error: code = NotFound desc = could not find container \"27f2986f9ba2f5fdfaa8117c4ee647c9c1c39258449c5dbb233388e1cad56c8c\": container with ID starting with 27f2986f9ba2f5fdfaa8117c4ee647c9c1c39258449c5dbb233388e1cad56c8c not found: ID does not exist" Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.083996 4956 scope.go:117] "RemoveContainer" containerID="aba869eab783e564e26aad703cc4f28d947525657d7b66b72565499eda75cc23" Mar 14 
09:12:55 crc kubenswrapper[4956]: E0314 09:12:55.084481 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aba869eab783e564e26aad703cc4f28d947525657d7b66b72565499eda75cc23\": container with ID starting with aba869eab783e564e26aad703cc4f28d947525657d7b66b72565499eda75cc23 not found: ID does not exist" containerID="aba869eab783e564e26aad703cc4f28d947525657d7b66b72565499eda75cc23" Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.084557 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aba869eab783e564e26aad703cc4f28d947525657d7b66b72565499eda75cc23"} err="failed to get container status \"aba869eab783e564e26aad703cc4f28d947525657d7b66b72565499eda75cc23\": rpc error: code = NotFound desc = could not find container \"aba869eab783e564e26aad703cc4f28d947525657d7b66b72565499eda75cc23\": container with ID starting with aba869eab783e564e26aad703cc4f28d947525657d7b66b72565499eda75cc23 not found: ID does not exist" Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.084599 4956 scope.go:117] "RemoveContainer" containerID="45ffc2ba153ee3e2db6d83bd7a16e21a4e4838b6e2e0faaead3af412487f4254" Mar 14 09:12:55 crc kubenswrapper[4956]: E0314 09:12:55.093011 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45ffc2ba153ee3e2db6d83bd7a16e21a4e4838b6e2e0faaead3af412487f4254\": container with ID starting with 45ffc2ba153ee3e2db6d83bd7a16e21a4e4838b6e2e0faaead3af412487f4254 not found: ID does not exist" containerID="45ffc2ba153ee3e2db6d83bd7a16e21a4e4838b6e2e0faaead3af412487f4254" Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.093104 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ffc2ba153ee3e2db6d83bd7a16e21a4e4838b6e2e0faaead3af412487f4254"} err="failed to get container status 
\"45ffc2ba153ee3e2db6d83bd7a16e21a4e4838b6e2e0faaead3af412487f4254\": rpc error: code = NotFound desc = could not find container \"45ffc2ba153ee3e2db6d83bd7a16e21a4e4838b6e2e0faaead3af412487f4254\": container with ID starting with 45ffc2ba153ee3e2db6d83bd7a16e21a4e4838b6e2e0faaead3af412487f4254 not found: ID does not exist" Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.218964 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfade054-fee3-4b75-9b0c-5b072fd47a61" path="/var/lib/kubelet/pods/cfade054-fee3-4b75-9b0c-5b072fd47a61/volumes" Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.424009 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:12:55 crc kubenswrapper[4956]: I0314 09:12:55.424100 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:12:56 crc kubenswrapper[4956]: I0314 09:12:56.895836 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2cc2c"] Mar 14 09:12:56 crc kubenswrapper[4956]: I0314 09:12:56.896398 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2cc2c" podUID="0d890a74-2b52-43f3-91bb-8d1730a21770" containerName="registry-server" containerID="cri-o://45058b6455179e6be1ad01abc824fe676b24f08178e43e0b97b13086e876838f" gracePeriod=2 Mar 14 09:12:57 crc kubenswrapper[4956]: I0314 09:12:57.027476 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="0d890a74-2b52-43f3-91bb-8d1730a21770" containerID="45058b6455179e6be1ad01abc824fe676b24f08178e43e0b97b13086e876838f" exitCode=0 Mar 14 09:12:57 crc kubenswrapper[4956]: I0314 09:12:57.027530 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cc2c" event={"ID":"0d890a74-2b52-43f3-91bb-8d1730a21770","Type":"ContainerDied","Data":"45058b6455179e6be1ad01abc824fe676b24f08178e43e0b97b13086e876838f"} Mar 14 09:12:57 crc kubenswrapper[4956]: I0314 09:12:57.303120 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2cc2c" Mar 14 09:12:57 crc kubenswrapper[4956]: I0314 09:12:57.367967 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d890a74-2b52-43f3-91bb-8d1730a21770-utilities\") pod \"0d890a74-2b52-43f3-91bb-8d1730a21770\" (UID: \"0d890a74-2b52-43f3-91bb-8d1730a21770\") " Mar 14 09:12:57 crc kubenswrapper[4956]: I0314 09:12:57.368037 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d890a74-2b52-43f3-91bb-8d1730a21770-catalog-content\") pod \"0d890a74-2b52-43f3-91bb-8d1730a21770\" (UID: \"0d890a74-2b52-43f3-91bb-8d1730a21770\") " Mar 14 09:12:57 crc kubenswrapper[4956]: I0314 09:12:57.368093 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2hr4\" (UniqueName: \"kubernetes.io/projected/0d890a74-2b52-43f3-91bb-8d1730a21770-kube-api-access-s2hr4\") pod \"0d890a74-2b52-43f3-91bb-8d1730a21770\" (UID: \"0d890a74-2b52-43f3-91bb-8d1730a21770\") " Mar 14 09:12:57 crc kubenswrapper[4956]: I0314 09:12:57.369458 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d890a74-2b52-43f3-91bb-8d1730a21770-utilities" (OuterVolumeSpecName: "utilities") pod 
"0d890a74-2b52-43f3-91bb-8d1730a21770" (UID: "0d890a74-2b52-43f3-91bb-8d1730a21770"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:12:57 crc kubenswrapper[4956]: I0314 09:12:57.375583 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d890a74-2b52-43f3-91bb-8d1730a21770-kube-api-access-s2hr4" (OuterVolumeSpecName: "kube-api-access-s2hr4") pod "0d890a74-2b52-43f3-91bb-8d1730a21770" (UID: "0d890a74-2b52-43f3-91bb-8d1730a21770"). InnerVolumeSpecName "kube-api-access-s2hr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:12:57 crc kubenswrapper[4956]: I0314 09:12:57.421532 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d890a74-2b52-43f3-91bb-8d1730a21770-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d890a74-2b52-43f3-91bb-8d1730a21770" (UID: "0d890a74-2b52-43f3-91bb-8d1730a21770"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:12:57 crc kubenswrapper[4956]: I0314 09:12:57.470246 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d890a74-2b52-43f3-91bb-8d1730a21770-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:12:57 crc kubenswrapper[4956]: I0314 09:12:57.470290 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2hr4\" (UniqueName: \"kubernetes.io/projected/0d890a74-2b52-43f3-91bb-8d1730a21770-kube-api-access-s2hr4\") on node \"crc\" DevicePath \"\"" Mar 14 09:12:57 crc kubenswrapper[4956]: I0314 09:12:57.470304 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d890a74-2b52-43f3-91bb-8d1730a21770-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:12:58 crc kubenswrapper[4956]: I0314 09:12:58.039344 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cc2c" event={"ID":"0d890a74-2b52-43f3-91bb-8d1730a21770","Type":"ContainerDied","Data":"5d453b9efa9b599bf2de5b5dbb6dbee332e26be5615f1433aa6f0b395ee6fad6"} Mar 14 09:12:58 crc kubenswrapper[4956]: I0314 09:12:58.039581 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2cc2c" Mar 14 09:12:58 crc kubenswrapper[4956]: I0314 09:12:58.041205 4956 scope.go:117] "RemoveContainer" containerID="45058b6455179e6be1ad01abc824fe676b24f08178e43e0b97b13086e876838f" Mar 14 09:12:58 crc kubenswrapper[4956]: I0314 09:12:58.067195 4956 scope.go:117] "RemoveContainer" containerID="e254f8f52c03aecdab2668839a1a70a06897788e78e16617e0a802cc742789ed" Mar 14 09:12:58 crc kubenswrapper[4956]: I0314 09:12:58.077402 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2cc2c"] Mar 14 09:12:58 crc kubenswrapper[4956]: I0314 09:12:58.087577 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2cc2c"] Mar 14 09:12:58 crc kubenswrapper[4956]: I0314 09:12:58.089167 4956 scope.go:117] "RemoveContainer" containerID="fc1df94221d44525129f421b964108841e18f2081684894230866f0c8591d7b9" Mar 14 09:12:59 crc kubenswrapper[4956]: I0314 09:12:59.038253 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r2qlr" Mar 14 09:12:59 crc kubenswrapper[4956]: I0314 09:12:59.038337 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r2qlr" Mar 14 09:12:59 crc kubenswrapper[4956]: I0314 09:12:59.080946 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r2qlr" Mar 14 09:12:59 crc kubenswrapper[4956]: I0314 09:12:59.132582 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r2qlr" Mar 14 09:12:59 crc kubenswrapper[4956]: I0314 09:12:59.221671 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d890a74-2b52-43f3-91bb-8d1730a21770" path="/var/lib/kubelet/pods/0d890a74-2b52-43f3-91bb-8d1730a21770/volumes" Mar 14 09:13:01 crc 
kubenswrapper[4956]: I0314 09:13:01.496792 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2qlr"] Mar 14 09:13:01 crc kubenswrapper[4956]: I0314 09:13:01.497410 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r2qlr" podUID="6df6b026-36f0-4a23-8a8b-42d11e38dfdd" containerName="registry-server" containerID="cri-o://2a49572530c82eaf6807aa177c05ec393dd374f045e884af1d5013572629c009" gracePeriod=2 Mar 14 09:13:01 crc kubenswrapper[4956]: I0314 09:13:01.868404 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2qlr" Mar 14 09:13:01 crc kubenswrapper[4956]: I0314 09:13:01.940642 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-utilities\") pod \"6df6b026-36f0-4a23-8a8b-42d11e38dfdd\" (UID: \"6df6b026-36f0-4a23-8a8b-42d11e38dfdd\") " Mar 14 09:13:01 crc kubenswrapper[4956]: I0314 09:13:01.940746 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-catalog-content\") pod \"6df6b026-36f0-4a23-8a8b-42d11e38dfdd\" (UID: \"6df6b026-36f0-4a23-8a8b-42d11e38dfdd\") " Mar 14 09:13:01 crc kubenswrapper[4956]: I0314 09:13:01.940789 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf54v\" (UniqueName: \"kubernetes.io/projected/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-kube-api-access-xf54v\") pod \"6df6b026-36f0-4a23-8a8b-42d11e38dfdd\" (UID: \"6df6b026-36f0-4a23-8a8b-42d11e38dfdd\") " Mar 14 09:13:01 crc kubenswrapper[4956]: I0314 09:13:01.941622 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-utilities" (OuterVolumeSpecName: "utilities") pod "6df6b026-36f0-4a23-8a8b-42d11e38dfdd" (UID: "6df6b026-36f0-4a23-8a8b-42d11e38dfdd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:13:01 crc kubenswrapper[4956]: I0314 09:13:01.945912 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-kube-api-access-xf54v" (OuterVolumeSpecName: "kube-api-access-xf54v") pod "6df6b026-36f0-4a23-8a8b-42d11e38dfdd" (UID: "6df6b026-36f0-4a23-8a8b-42d11e38dfdd"). InnerVolumeSpecName "kube-api-access-xf54v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:13:01 crc kubenswrapper[4956]: I0314 09:13:01.970337 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6df6b026-36f0-4a23-8a8b-42d11e38dfdd" (UID: "6df6b026-36f0-4a23-8a8b-42d11e38dfdd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.042698 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.042737 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.042748 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf54v\" (UniqueName: \"kubernetes.io/projected/6df6b026-36f0-4a23-8a8b-42d11e38dfdd-kube-api-access-xf54v\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.095757 4956 generic.go:334] "Generic (PLEG): container finished" podID="6df6b026-36f0-4a23-8a8b-42d11e38dfdd" containerID="2a49572530c82eaf6807aa177c05ec393dd374f045e884af1d5013572629c009" exitCode=0 Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.095823 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2qlr" event={"ID":"6df6b026-36f0-4a23-8a8b-42d11e38dfdd","Type":"ContainerDied","Data":"2a49572530c82eaf6807aa177c05ec393dd374f045e884af1d5013572629c009"} Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.095868 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2qlr" Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.095885 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2qlr" event={"ID":"6df6b026-36f0-4a23-8a8b-42d11e38dfdd","Type":"ContainerDied","Data":"c3b7d9b0c3ee6727a4e01c2d2b7234e283bfe22f376601583c3fd79b50611d9b"} Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.095908 4956 scope.go:117] "RemoveContainer" containerID="2a49572530c82eaf6807aa177c05ec393dd374f045e884af1d5013572629c009" Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.111813 4956 scope.go:117] "RemoveContainer" containerID="0c96c869d0132616d3c9bb728b219764cd6bafcad3ae510c0b13b44007b2a2f8" Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.128644 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2qlr"] Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.133265 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2qlr"] Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.148037 4956 scope.go:117] "RemoveContainer" containerID="5624bc7bec07bcf5a204c968ec34dc34a1592fa513222ebb3013cd013212e8f7" Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.166532 4956 scope.go:117] "RemoveContainer" containerID="2a49572530c82eaf6807aa177c05ec393dd374f045e884af1d5013572629c009" Mar 14 09:13:02 crc kubenswrapper[4956]: E0314 09:13:02.166936 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a49572530c82eaf6807aa177c05ec393dd374f045e884af1d5013572629c009\": container with ID starting with 2a49572530c82eaf6807aa177c05ec393dd374f045e884af1d5013572629c009 not found: ID does not exist" containerID="2a49572530c82eaf6807aa177c05ec393dd374f045e884af1d5013572629c009" Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.166978 4956 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a49572530c82eaf6807aa177c05ec393dd374f045e884af1d5013572629c009"} err="failed to get container status \"2a49572530c82eaf6807aa177c05ec393dd374f045e884af1d5013572629c009\": rpc error: code = NotFound desc = could not find container \"2a49572530c82eaf6807aa177c05ec393dd374f045e884af1d5013572629c009\": container with ID starting with 2a49572530c82eaf6807aa177c05ec393dd374f045e884af1d5013572629c009 not found: ID does not exist" Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.167002 4956 scope.go:117] "RemoveContainer" containerID="0c96c869d0132616d3c9bb728b219764cd6bafcad3ae510c0b13b44007b2a2f8" Mar 14 09:13:02 crc kubenswrapper[4956]: E0314 09:13:02.167447 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c96c869d0132616d3c9bb728b219764cd6bafcad3ae510c0b13b44007b2a2f8\": container with ID starting with 0c96c869d0132616d3c9bb728b219764cd6bafcad3ae510c0b13b44007b2a2f8 not found: ID does not exist" containerID="0c96c869d0132616d3c9bb728b219764cd6bafcad3ae510c0b13b44007b2a2f8" Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.167560 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c96c869d0132616d3c9bb728b219764cd6bafcad3ae510c0b13b44007b2a2f8"} err="failed to get container status \"0c96c869d0132616d3c9bb728b219764cd6bafcad3ae510c0b13b44007b2a2f8\": rpc error: code = NotFound desc = could not find container \"0c96c869d0132616d3c9bb728b219764cd6bafcad3ae510c0b13b44007b2a2f8\": container with ID starting with 0c96c869d0132616d3c9bb728b219764cd6bafcad3ae510c0b13b44007b2a2f8 not found: ID does not exist" Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.167583 4956 scope.go:117] "RemoveContainer" containerID="5624bc7bec07bcf5a204c968ec34dc34a1592fa513222ebb3013cd013212e8f7" Mar 14 09:13:02 crc kubenswrapper[4956]: E0314 
09:13:02.167854 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5624bc7bec07bcf5a204c968ec34dc34a1592fa513222ebb3013cd013212e8f7\": container with ID starting with 5624bc7bec07bcf5a204c968ec34dc34a1592fa513222ebb3013cd013212e8f7 not found: ID does not exist" containerID="5624bc7bec07bcf5a204c968ec34dc34a1592fa513222ebb3013cd013212e8f7" Mar 14 09:13:02 crc kubenswrapper[4956]: I0314 09:13:02.167878 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5624bc7bec07bcf5a204c968ec34dc34a1592fa513222ebb3013cd013212e8f7"} err="failed to get container status \"5624bc7bec07bcf5a204c968ec34dc34a1592fa513222ebb3013cd013212e8f7\": rpc error: code = NotFound desc = could not find container \"5624bc7bec07bcf5a204c968ec34dc34a1592fa513222ebb3013cd013212e8f7\": container with ID starting with 5624bc7bec07bcf5a204c968ec34dc34a1592fa513222ebb3013cd013212e8f7 not found: ID does not exist" Mar 14 09:13:03 crc kubenswrapper[4956]: I0314 09:13:03.226064 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6df6b026-36f0-4a23-8a8b-42d11e38dfdd" path="/var/lib/kubelet/pods/6df6b026-36f0-4a23-8a8b-42d11e38dfdd/volumes" Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.904529 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-kgjkt"] Mar 14 09:13:06 crc kubenswrapper[4956]: E0314 09:13:06.905225 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfade054-fee3-4b75-9b0c-5b072fd47a61" containerName="registry-server" Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.905513 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfade054-fee3-4b75-9b0c-5b072fd47a61" containerName="registry-server" Mar 14 09:13:06 crc kubenswrapper[4956]: E0314 09:13:06.905541 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0d890a74-2b52-43f3-91bb-8d1730a21770" containerName="extract-content" Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.905554 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d890a74-2b52-43f3-91bb-8d1730a21770" containerName="extract-content" Mar 14 09:13:06 crc kubenswrapper[4956]: E0314 09:13:06.905571 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df6b026-36f0-4a23-8a8b-42d11e38dfdd" containerName="extract-utilities" Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.905582 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df6b026-36f0-4a23-8a8b-42d11e38dfdd" containerName="extract-utilities" Mar 14 09:13:06 crc kubenswrapper[4956]: E0314 09:13:06.905598 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d890a74-2b52-43f3-91bb-8d1730a21770" containerName="registry-server" Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.905608 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d890a74-2b52-43f3-91bb-8d1730a21770" containerName="registry-server" Mar 14 09:13:06 crc kubenswrapper[4956]: E0314 09:13:06.905629 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df6b026-36f0-4a23-8a8b-42d11e38dfdd" containerName="extract-content" Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.905640 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df6b026-36f0-4a23-8a8b-42d11e38dfdd" containerName="extract-content" Mar 14 09:13:06 crc kubenswrapper[4956]: E0314 09:13:06.905654 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d890a74-2b52-43f3-91bb-8d1730a21770" containerName="extract-utilities" Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.905664 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d890a74-2b52-43f3-91bb-8d1730a21770" containerName="extract-utilities" Mar 14 09:13:06 crc kubenswrapper[4956]: E0314 09:13:06.905680 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6df6b026-36f0-4a23-8a8b-42d11e38dfdd" containerName="registry-server" Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.905690 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df6b026-36f0-4a23-8a8b-42d11e38dfdd" containerName="registry-server" Mar 14 09:13:06 crc kubenswrapper[4956]: E0314 09:13:06.905707 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfade054-fee3-4b75-9b0c-5b072fd47a61" containerName="extract-utilities" Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.905717 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfade054-fee3-4b75-9b0c-5b072fd47a61" containerName="extract-utilities" Mar 14 09:13:06 crc kubenswrapper[4956]: E0314 09:13:06.905733 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfade054-fee3-4b75-9b0c-5b072fd47a61" containerName="extract-content" Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.905743 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfade054-fee3-4b75-9b0c-5b072fd47a61" containerName="extract-content" Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.905962 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6df6b026-36f0-4a23-8a8b-42d11e38dfdd" containerName="registry-server" Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.906011 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d890a74-2b52-43f3-91bb-8d1730a21770" containerName="registry-server" Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.906029 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfade054-fee3-4b75-9b0c-5b072fd47a61" containerName="registry-server" Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.906710 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-kgjkt" Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.909540 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.909669 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.912623 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kgjkt"] Mar 14 09:13:06 crc kubenswrapper[4956]: I0314 09:13:06.913312 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-9njht" Mar 14 09:13:07 crc kubenswrapper[4956]: I0314 09:13:07.009839 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwwnx\" (UniqueName: \"kubernetes.io/projected/9ac17263-727b-4bdd-8217-920844d59367-kube-api-access-fwwnx\") pod \"openstack-operator-index-kgjkt\" (UID: \"9ac17263-727b-4bdd-8217-920844d59367\") " pod="openstack-operators/openstack-operator-index-kgjkt" Mar 14 09:13:07 crc kubenswrapper[4956]: I0314 09:13:07.111461 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwwnx\" (UniqueName: \"kubernetes.io/projected/9ac17263-727b-4bdd-8217-920844d59367-kube-api-access-fwwnx\") pod \"openstack-operator-index-kgjkt\" (UID: \"9ac17263-727b-4bdd-8217-920844d59367\") " pod="openstack-operators/openstack-operator-index-kgjkt" Mar 14 09:13:07 crc kubenswrapper[4956]: I0314 09:13:07.148221 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwwnx\" (UniqueName: \"kubernetes.io/projected/9ac17263-727b-4bdd-8217-920844d59367-kube-api-access-fwwnx\") pod \"openstack-operator-index-kgjkt\" (UID: 
\"9ac17263-727b-4bdd-8217-920844d59367\") " pod="openstack-operators/openstack-operator-index-kgjkt" Mar 14 09:13:07 crc kubenswrapper[4956]: I0314 09:13:07.228706 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kgjkt" Mar 14 09:13:07 crc kubenswrapper[4956]: I0314 09:13:07.421046 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kgjkt"] Mar 14 09:13:07 crc kubenswrapper[4956]: W0314 09:13:07.427657 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ac17263_727b_4bdd_8217_920844d59367.slice/crio-00f0a3e2db4c7cf3262e447cf560b95f42f6362e150357f53edc0b144c305934 WatchSource:0}: Error finding container 00f0a3e2db4c7cf3262e447cf560b95f42f6362e150357f53edc0b144c305934: Status 404 returned error can't find the container with id 00f0a3e2db4c7cf3262e447cf560b95f42f6362e150357f53edc0b144c305934 Mar 14 09:13:08 crc kubenswrapper[4956]: I0314 09:13:08.156755 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kgjkt" event={"ID":"9ac17263-727b-4bdd-8217-920844d59367","Type":"ContainerStarted","Data":"00f0a3e2db4c7cf3262e447cf560b95f42f6362e150357f53edc0b144c305934"} Mar 14 09:13:10 crc kubenswrapper[4956]: I0314 09:13:10.171455 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kgjkt" event={"ID":"9ac17263-727b-4bdd-8217-920844d59367","Type":"ContainerStarted","Data":"5a35b8f8183fa18ab9744a031577b64bbeeee2edde3a2f35dd0fe2c512257ca5"} Mar 14 09:13:10 crc kubenswrapper[4956]: I0314 09:13:10.189883 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-kgjkt" podStartSLOduration=1.754972582 podStartE2EDuration="4.189864542s" podCreationTimestamp="2026-03-14 09:13:06 +0000 UTC" 
firstStartedPulling="2026-03-14 09:13:07.429640355 +0000 UTC m=+992.942332623" lastFinishedPulling="2026-03-14 09:13:09.864532315 +0000 UTC m=+995.377224583" observedRunningTime="2026-03-14 09:13:10.185252527 +0000 UTC m=+995.697944795" watchObservedRunningTime="2026-03-14 09:13:10.189864542 +0000 UTC m=+995.702556810" Mar 14 09:13:17 crc kubenswrapper[4956]: I0314 09:13:17.229403 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-kgjkt" Mar 14 09:13:17 crc kubenswrapper[4956]: I0314 09:13:17.229943 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-kgjkt" Mar 14 09:13:17 crc kubenswrapper[4956]: I0314 09:13:17.259274 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-kgjkt" Mar 14 09:13:18 crc kubenswrapper[4956]: I0314 09:13:18.260703 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-kgjkt" Mar 14 09:13:25 crc kubenswrapper[4956]: I0314 09:13:25.424204 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:13:25 crc kubenswrapper[4956]: I0314 09:13:25.425528 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:13:25 crc kubenswrapper[4956]: I0314 09:13:25.734396 4956 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb"] Mar 14 09:13:25 crc kubenswrapper[4956]: I0314 09:13:25.735837 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" Mar 14 09:13:25 crc kubenswrapper[4956]: I0314 09:13:25.738175 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-nndhm" Mar 14 09:13:25 crc kubenswrapper[4956]: I0314 09:13:25.746673 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb"] Mar 14 09:13:25 crc kubenswrapper[4956]: I0314 09:13:25.756959 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-util\") pod \"62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb\" (UID: \"aec90a2d-58f3-4dee-90c3-60a8fc90bc56\") " pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" Mar 14 09:13:25 crc kubenswrapper[4956]: I0314 09:13:25.757004 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-bundle\") pod \"62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb\" (UID: \"aec90a2d-58f3-4dee-90c3-60a8fc90bc56\") " pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" Mar 14 09:13:25 crc kubenswrapper[4956]: I0314 09:13:25.757085 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj9vg\" (UniqueName: \"kubernetes.io/projected/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-kube-api-access-pj9vg\") pod 
\"62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb\" (UID: \"aec90a2d-58f3-4dee-90c3-60a8fc90bc56\") " pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" Mar 14 09:13:25 crc kubenswrapper[4956]: I0314 09:13:25.857959 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj9vg\" (UniqueName: \"kubernetes.io/projected/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-kube-api-access-pj9vg\") pod \"62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb\" (UID: \"aec90a2d-58f3-4dee-90c3-60a8fc90bc56\") " pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" Mar 14 09:13:25 crc kubenswrapper[4956]: I0314 09:13:25.858277 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-util\") pod \"62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb\" (UID: \"aec90a2d-58f3-4dee-90c3-60a8fc90bc56\") " pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" Mar 14 09:13:25 crc kubenswrapper[4956]: I0314 09:13:25.858400 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-bundle\") pod \"62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb\" (UID: \"aec90a2d-58f3-4dee-90c3-60a8fc90bc56\") " pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" Mar 14 09:13:25 crc kubenswrapper[4956]: I0314 09:13:25.859188 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-bundle\") pod \"62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb\" (UID: \"aec90a2d-58f3-4dee-90c3-60a8fc90bc56\") " 
pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" Mar 14 09:13:25 crc kubenswrapper[4956]: I0314 09:13:25.859604 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-util\") pod \"62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb\" (UID: \"aec90a2d-58f3-4dee-90c3-60a8fc90bc56\") " pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" Mar 14 09:13:25 crc kubenswrapper[4956]: I0314 09:13:25.877771 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj9vg\" (UniqueName: \"kubernetes.io/projected/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-kube-api-access-pj9vg\") pod \"62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb\" (UID: \"aec90a2d-58f3-4dee-90c3-60a8fc90bc56\") " pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" Mar 14 09:13:26 crc kubenswrapper[4956]: I0314 09:13:26.060000 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" Mar 14 09:13:26 crc kubenswrapper[4956]: I0314 09:13:26.463070 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb"] Mar 14 09:13:27 crc kubenswrapper[4956]: I0314 09:13:27.280173 4956 generic.go:334] "Generic (PLEG): container finished" podID="aec90a2d-58f3-4dee-90c3-60a8fc90bc56" containerID="d724ca52322db23bbcfeaaa7317f99b2c9b7909b7a845ebc2059e85d1c314154" exitCode=0 Mar 14 09:13:27 crc kubenswrapper[4956]: I0314 09:13:27.280348 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" event={"ID":"aec90a2d-58f3-4dee-90c3-60a8fc90bc56","Type":"ContainerDied","Data":"d724ca52322db23bbcfeaaa7317f99b2c9b7909b7a845ebc2059e85d1c314154"} Mar 14 09:13:27 crc kubenswrapper[4956]: I0314 09:13:27.280504 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" event={"ID":"aec90a2d-58f3-4dee-90c3-60a8fc90bc56","Type":"ContainerStarted","Data":"5fa520a675158fc3f4927860cb8603ffdf87d5342e20be1a2347dd8d7c06b591"} Mar 14 09:13:28 crc kubenswrapper[4956]: I0314 09:13:28.289962 4956 generic.go:334] "Generic (PLEG): container finished" podID="aec90a2d-58f3-4dee-90c3-60a8fc90bc56" containerID="d9667018dff0033afde99430ca716bd6971f8c4417cad4c567dffbf26269a0dd" exitCode=0 Mar 14 09:13:28 crc kubenswrapper[4956]: I0314 09:13:28.290022 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" event={"ID":"aec90a2d-58f3-4dee-90c3-60a8fc90bc56","Type":"ContainerDied","Data":"d9667018dff0033afde99430ca716bd6971f8c4417cad4c567dffbf26269a0dd"} Mar 14 09:13:29 crc kubenswrapper[4956]: I0314 09:13:29.298897 4956 generic.go:334] 
"Generic (PLEG): container finished" podID="aec90a2d-58f3-4dee-90c3-60a8fc90bc56" containerID="665fe9bfcc09dfc5f8d1fb8fe31b05ecfc7ffb7a7d93eefe3fbcef3e82e97c4d" exitCode=0 Mar 14 09:13:29 crc kubenswrapper[4956]: I0314 09:13:29.298943 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" event={"ID":"aec90a2d-58f3-4dee-90c3-60a8fc90bc56","Type":"ContainerDied","Data":"665fe9bfcc09dfc5f8d1fb8fe31b05ecfc7ffb7a7d93eefe3fbcef3e82e97c4d"} Mar 14 09:13:30 crc kubenswrapper[4956]: I0314 09:13:30.558299 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" Mar 14 09:13:30 crc kubenswrapper[4956]: I0314 09:13:30.628468 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-util\") pod \"aec90a2d-58f3-4dee-90c3-60a8fc90bc56\" (UID: \"aec90a2d-58f3-4dee-90c3-60a8fc90bc56\") " Mar 14 09:13:30 crc kubenswrapper[4956]: I0314 09:13:30.628568 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-bundle\") pod \"aec90a2d-58f3-4dee-90c3-60a8fc90bc56\" (UID: \"aec90a2d-58f3-4dee-90c3-60a8fc90bc56\") " Mar 14 09:13:30 crc kubenswrapper[4956]: I0314 09:13:30.628670 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj9vg\" (UniqueName: \"kubernetes.io/projected/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-kube-api-access-pj9vg\") pod \"aec90a2d-58f3-4dee-90c3-60a8fc90bc56\" (UID: \"aec90a2d-58f3-4dee-90c3-60a8fc90bc56\") " Mar 14 09:13:30 crc kubenswrapper[4956]: I0314 09:13:30.629292 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-bundle" (OuterVolumeSpecName: "bundle") pod "aec90a2d-58f3-4dee-90c3-60a8fc90bc56" (UID: "aec90a2d-58f3-4dee-90c3-60a8fc90bc56"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:13:30 crc kubenswrapper[4956]: I0314 09:13:30.633189 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-kube-api-access-pj9vg" (OuterVolumeSpecName: "kube-api-access-pj9vg") pod "aec90a2d-58f3-4dee-90c3-60a8fc90bc56" (UID: "aec90a2d-58f3-4dee-90c3-60a8fc90bc56"). InnerVolumeSpecName "kube-api-access-pj9vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:13:30 crc kubenswrapper[4956]: I0314 09:13:30.646476 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-util" (OuterVolumeSpecName: "util") pod "aec90a2d-58f3-4dee-90c3-60a8fc90bc56" (UID: "aec90a2d-58f3-4dee-90c3-60a8fc90bc56"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:13:30 crc kubenswrapper[4956]: I0314 09:13:30.731152 4956 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-util\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:30 crc kubenswrapper[4956]: I0314 09:13:30.731282 4956 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:30 crc kubenswrapper[4956]: I0314 09:13:30.731296 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj9vg\" (UniqueName: \"kubernetes.io/projected/aec90a2d-58f3-4dee-90c3-60a8fc90bc56-kube-api-access-pj9vg\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:31 crc kubenswrapper[4956]: I0314 09:13:31.317766 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" event={"ID":"aec90a2d-58f3-4dee-90c3-60a8fc90bc56","Type":"ContainerDied","Data":"5fa520a675158fc3f4927860cb8603ffdf87d5342e20be1a2347dd8d7c06b591"} Mar 14 09:13:31 crc kubenswrapper[4956]: I0314 09:13:31.317811 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fa520a675158fc3f4927860cb8603ffdf87d5342e20be1a2347dd8d7c06b591" Mar 14 09:13:31 crc kubenswrapper[4956]: I0314 09:13:31.317894 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb" Mar 14 09:13:37 crc kubenswrapper[4956]: I0314 09:13:37.910186 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q"] Mar 14 09:13:37 crc kubenswrapper[4956]: E0314 09:13:37.911169 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aec90a2d-58f3-4dee-90c3-60a8fc90bc56" containerName="pull" Mar 14 09:13:37 crc kubenswrapper[4956]: I0314 09:13:37.911186 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec90a2d-58f3-4dee-90c3-60a8fc90bc56" containerName="pull" Mar 14 09:13:37 crc kubenswrapper[4956]: E0314 09:13:37.911205 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aec90a2d-58f3-4dee-90c3-60a8fc90bc56" containerName="extract" Mar 14 09:13:37 crc kubenswrapper[4956]: I0314 09:13:37.911214 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec90a2d-58f3-4dee-90c3-60a8fc90bc56" containerName="extract" Mar 14 09:13:37 crc kubenswrapper[4956]: E0314 09:13:37.911234 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aec90a2d-58f3-4dee-90c3-60a8fc90bc56" containerName="util" Mar 14 09:13:37 crc kubenswrapper[4956]: I0314 09:13:37.911243 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec90a2d-58f3-4dee-90c3-60a8fc90bc56" containerName="util" Mar 14 09:13:37 crc kubenswrapper[4956]: I0314 09:13:37.911420 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="aec90a2d-58f3-4dee-90c3-60a8fc90bc56" containerName="extract" Mar 14 09:13:37 crc kubenswrapper[4956]: I0314 09:13:37.912092 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q" Mar 14 09:13:37 crc kubenswrapper[4956]: I0314 09:13:37.914638 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-8mvj8" Mar 14 09:13:37 crc kubenswrapper[4956]: I0314 09:13:37.933615 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q"] Mar 14 09:13:38 crc kubenswrapper[4956]: I0314 09:13:38.025175 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvw59\" (UniqueName: \"kubernetes.io/projected/abca7d4b-14a7-431e-8b05-66a118ab327e-kube-api-access-dvw59\") pod \"openstack-operator-controller-init-6ccbf6d758-tgm5q\" (UID: \"abca7d4b-14a7-431e-8b05-66a118ab327e\") " pod="openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q" Mar 14 09:13:38 crc kubenswrapper[4956]: I0314 09:13:38.126336 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvw59\" (UniqueName: \"kubernetes.io/projected/abca7d4b-14a7-431e-8b05-66a118ab327e-kube-api-access-dvw59\") pod \"openstack-operator-controller-init-6ccbf6d758-tgm5q\" (UID: \"abca7d4b-14a7-431e-8b05-66a118ab327e\") " pod="openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q" Mar 14 09:13:38 crc kubenswrapper[4956]: I0314 09:13:38.152089 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvw59\" (UniqueName: \"kubernetes.io/projected/abca7d4b-14a7-431e-8b05-66a118ab327e-kube-api-access-dvw59\") pod \"openstack-operator-controller-init-6ccbf6d758-tgm5q\" (UID: \"abca7d4b-14a7-431e-8b05-66a118ab327e\") " pod="openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q" Mar 14 09:13:38 crc kubenswrapper[4956]: I0314 09:13:38.237735 4956 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q" Mar 14 09:13:38 crc kubenswrapper[4956]: I0314 09:13:38.790113 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q"] Mar 14 09:13:38 crc kubenswrapper[4956]: W0314 09:13:38.793415 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabca7d4b_14a7_431e_8b05_66a118ab327e.slice/crio-a726edc093df3219d78619d68d8889556fe20f42f4b52a497efd0ef37bb39de7 WatchSource:0}: Error finding container a726edc093df3219d78619d68d8889556fe20f42f4b52a497efd0ef37bb39de7: Status 404 returned error can't find the container with id a726edc093df3219d78619d68d8889556fe20f42f4b52a497efd0ef37bb39de7 Mar 14 09:13:39 crc kubenswrapper[4956]: I0314 09:13:39.604099 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q" event={"ID":"abca7d4b-14a7-431e-8b05-66a118ab327e","Type":"ContainerStarted","Data":"a726edc093df3219d78619d68d8889556fe20f42f4b52a497efd0ef37bb39de7"} Mar 14 09:13:42 crc kubenswrapper[4956]: I0314 09:13:42.626877 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q" event={"ID":"abca7d4b-14a7-431e-8b05-66a118ab327e","Type":"ContainerStarted","Data":"dde289f7a97902fecdf9760c7c08ccfd1189b9cdb07bd91908b112c614ad56e6"} Mar 14 09:13:42 crc kubenswrapper[4956]: I0314 09:13:42.628286 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q" Mar 14 09:13:48 crc kubenswrapper[4956]: I0314 09:13:48.240186 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q" Mar 14 09:13:48 crc 
kubenswrapper[4956]: I0314 09:13:48.271918 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q" podStartSLOduration=7.662651776 podStartE2EDuration="11.271900062s" podCreationTimestamp="2026-03-14 09:13:37 +0000 UTC" firstStartedPulling="2026-03-14 09:13:38.796327526 +0000 UTC m=+1024.309019784" lastFinishedPulling="2026-03-14 09:13:42.405575802 +0000 UTC m=+1027.918268070" observedRunningTime="2026-03-14 09:13:42.656376574 +0000 UTC m=+1028.169068862" watchObservedRunningTime="2026-03-14 09:13:48.271900062 +0000 UTC m=+1033.784592330" Mar 14 09:13:55 crc kubenswrapper[4956]: I0314 09:13:55.423910 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:13:55 crc kubenswrapper[4956]: I0314 09:13:55.424538 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:13:55 crc kubenswrapper[4956]: I0314 09:13:55.424595 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 09:13:55 crc kubenswrapper[4956]: I0314 09:13:55.425304 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c0530da8ab6e9909827338d4f4090fb9eca5500f5a446223956b17c49ce8aec4"} pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Mar 14 09:13:55 crc kubenswrapper[4956]: I0314 09:13:55.425357 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" containerID="cri-o://c0530da8ab6e9909827338d4f4090fb9eca5500f5a446223956b17c49ce8aec4" gracePeriod=600 Mar 14 09:13:55 crc kubenswrapper[4956]: I0314 09:13:55.731832 4956 generic.go:334] "Generic (PLEG): container finished" podID="9ba20367-e506-422e-a846-eb1525cb3b94" containerID="c0530da8ab6e9909827338d4f4090fb9eca5500f5a446223956b17c49ce8aec4" exitCode=0 Mar 14 09:13:55 crc kubenswrapper[4956]: I0314 09:13:55.731883 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerDied","Data":"c0530da8ab6e9909827338d4f4090fb9eca5500f5a446223956b17c49ce8aec4"} Mar 14 09:13:55 crc kubenswrapper[4956]: I0314 09:13:55.732200 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerStarted","Data":"627cfd99357e3b66570faed20a1ce7ae2cc7c510054d839a6cb159952f60d6be"} Mar 14 09:13:55 crc kubenswrapper[4956]: I0314 09:13:55.732219 4956 scope.go:117] "RemoveContainer" containerID="a47c6c00ee731cd8c4ec91df013e1936bd43fead85afc21e044951f9ea4d95f7" Mar 14 09:14:00 crc kubenswrapper[4956]: I0314 09:14:00.138853 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557994-zvc72"] Mar 14 09:14:00 crc kubenswrapper[4956]: I0314 09:14:00.140528 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557994-zvc72" Mar 14 09:14:00 crc kubenswrapper[4956]: I0314 09:14:00.147124 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:14:00 crc kubenswrapper[4956]: I0314 09:14:00.147240 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:14:00 crc kubenswrapper[4956]: I0314 09:14:00.147389 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:14:00 crc kubenswrapper[4956]: I0314 09:14:00.152450 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557994-zvc72"] Mar 14 09:14:00 crc kubenswrapper[4956]: I0314 09:14:00.216312 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd4qx\" (UniqueName: \"kubernetes.io/projected/414e50d4-0a21-488c-b82c-95e9efd119cb-kube-api-access-pd4qx\") pod \"auto-csr-approver-29557994-zvc72\" (UID: \"414e50d4-0a21-488c-b82c-95e9efd119cb\") " pod="openshift-infra/auto-csr-approver-29557994-zvc72" Mar 14 09:14:00 crc kubenswrapper[4956]: I0314 09:14:00.317402 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd4qx\" (UniqueName: \"kubernetes.io/projected/414e50d4-0a21-488c-b82c-95e9efd119cb-kube-api-access-pd4qx\") pod \"auto-csr-approver-29557994-zvc72\" (UID: \"414e50d4-0a21-488c-b82c-95e9efd119cb\") " pod="openshift-infra/auto-csr-approver-29557994-zvc72" Mar 14 09:14:00 crc kubenswrapper[4956]: I0314 09:14:00.387226 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd4qx\" (UniqueName: \"kubernetes.io/projected/414e50d4-0a21-488c-b82c-95e9efd119cb-kube-api-access-pd4qx\") pod \"auto-csr-approver-29557994-zvc72\" (UID: \"414e50d4-0a21-488c-b82c-95e9efd119cb\") " 
pod="openshift-infra/auto-csr-approver-29557994-zvc72" Mar 14 09:14:00 crc kubenswrapper[4956]: I0314 09:14:00.467278 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557994-zvc72" Mar 14 09:14:00 crc kubenswrapper[4956]: I0314 09:14:00.877578 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557994-zvc72"] Mar 14 09:14:01 crc kubenswrapper[4956]: I0314 09:14:01.769805 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557994-zvc72" event={"ID":"414e50d4-0a21-488c-b82c-95e9efd119cb","Type":"ContainerStarted","Data":"2daddbff6936d5d27f1497bac564b7cf39a68012b4a173dd128747b9ccb4a02b"} Mar 14 09:14:05 crc kubenswrapper[4956]: I0314 09:14:05.801122 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557994-zvc72" event={"ID":"414e50d4-0a21-488c-b82c-95e9efd119cb","Type":"ContainerStarted","Data":"4e2c2598dc3408c30bd4ddb0a6cf96b5e84167b0b935a8e931ac0c0d3f706d36"} Mar 14 09:14:06 crc kubenswrapper[4956]: I0314 09:14:06.809918 4956 generic.go:334] "Generic (PLEG): container finished" podID="414e50d4-0a21-488c-b82c-95e9efd119cb" containerID="4e2c2598dc3408c30bd4ddb0a6cf96b5e84167b0b935a8e931ac0c0d3f706d36" exitCode=0 Mar 14 09:14:06 crc kubenswrapper[4956]: I0314 09:14:06.810233 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557994-zvc72" event={"ID":"414e50d4-0a21-488c-b82c-95e9efd119cb","Type":"ContainerDied","Data":"4e2c2598dc3408c30bd4ddb0a6cf96b5e84167b0b935a8e931ac0c0d3f706d36"} Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.163227 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557994-zvc72" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.282170 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64768694d-m9dwm"] Mar 14 09:14:08 crc kubenswrapper[4956]: E0314 09:14:08.282568 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="414e50d4-0a21-488c-b82c-95e9efd119cb" containerName="oc" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.282587 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="414e50d4-0a21-488c-b82c-95e9efd119cb" containerName="oc" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.282755 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="414e50d4-0a21-488c-b82c-95e9efd119cb" containerName="oc" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.283354 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64768694d-m9dwm" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.285470 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-qqvkd" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.291788 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-cb6d66846-b5jzw"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.292667 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-cb6d66846-b5jzw" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.297010 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-859nm" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.302227 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64768694d-m9dwm"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.312522 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-cb6d66846-b5jzw"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.329552 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-9c8c85cd7-8xwbp"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.330436 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd4qx\" (UniqueName: \"kubernetes.io/projected/414e50d4-0a21-488c-b82c-95e9efd119cb-kube-api-access-pd4qx\") pod \"414e50d4-0a21-488c-b82c-95e9efd119cb\" (UID: \"414e50d4-0a21-488c-b82c-95e9efd119cb\") " Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.330607 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9c8c85cd7-8xwbp" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.336459 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-zhvnl" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.342561 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9c8c85cd7-8xwbp"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.354159 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-74d565fbd5-c924b"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.355039 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-74d565fbd5-c924b" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.356688 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/414e50d4-0a21-488c-b82c-95e9efd119cb-kube-api-access-pd4qx" (OuterVolumeSpecName: "kube-api-access-pd4qx") pod "414e50d4-0a21-488c-b82c-95e9efd119cb" (UID: "414e50d4-0a21-488c-b82c-95e9efd119cb"). InnerVolumeSpecName "kube-api-access-pd4qx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.357355 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-wvvvh" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.365579 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d6bd468b-db2v8"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.367732 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d6bd468b-db2v8" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.373172 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-sn6gc" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.385595 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d6bd468b-db2v8"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.396275 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-74d565fbd5-c924b"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.404380 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9475cdd7-586s5"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.405579 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9475cdd7-586s5" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.408159 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-dbbnd" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.432241 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6kkz\" (UniqueName: \"kubernetes.io/projected/9bf45de7-ba46-4ce9-a7d7-fc26e253423b-kube-api-access-f6kkz\") pod \"cinder-operator-controller-manager-cb6d66846-b5jzw\" (UID: \"9bf45de7-ba46-4ce9-a7d7-fc26e253423b\") " pod="openstack-operators/cinder-operator-controller-manager-cb6d66846-b5jzw" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.432313 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ft5g\" (UniqueName: \"kubernetes.io/projected/b3749fc9-e22e-42b9-8865-68679f7d78f1-kube-api-access-7ft5g\") pod \"barbican-operator-controller-manager-64768694d-m9dwm\" (UID: \"b3749fc9-e22e-42b9-8865-68679f7d78f1\") " pod="openstack-operators/barbican-operator-controller-manager-64768694d-m9dwm" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.432348 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfnmj\" (UniqueName: \"kubernetes.io/projected/6d1aed1b-6436-46ca-a824-59eafb8ca5d3-kube-api-access-qfnmj\") pod \"designate-operator-controller-manager-9c8c85cd7-8xwbp\" (UID: \"6d1aed1b-6436-46ca-a824-59eafb8ca5d3\") " pod="openstack-operators/designate-operator-controller-manager-9c8c85cd7-8xwbp" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.432403 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd4qx\" (UniqueName: \"kubernetes.io/projected/414e50d4-0a21-488c-b82c-95e9efd119cb-kube-api-access-pd4qx\") on 
node \"crc\" DevicePath \"\"" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.433765 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.439235 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.443316 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.443670 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-sp7ht" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.443819 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9475cdd7-586s5"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.450231 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.470967 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-bf6b7fd8c-56k2s"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.472012 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-bf6b7fd8c-56k2s" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.480144 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-wjbbd" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.480361 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-68f8d496f8-4knlv"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.481777 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-68f8d496f8-4knlv" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.484601 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-dmb8w" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.503366 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-bf6b7fd8c-56k2s"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.514259 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6f6f57b9b6-9c7lm"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.515117 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6f6f57b9b6-9c7lm" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.517507 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-4bn47" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.526386 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-68f8d496f8-4knlv"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.533159 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfnmj\" (UniqueName: \"kubernetes.io/projected/6d1aed1b-6436-46ca-a824-59eafb8ca5d3-kube-api-access-qfnmj\") pod \"designate-operator-controller-manager-9c8c85cd7-8xwbp\" (UID: \"6d1aed1b-6436-46ca-a824-59eafb8ca5d3\") " pod="openstack-operators/designate-operator-controller-manager-9c8c85cd7-8xwbp" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.533300 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxwkm\" (UniqueName: \"kubernetes.io/projected/876c14aa-a86a-495f-a110-3ade7d8d69fb-kube-api-access-hxwkm\") pod \"heat-operator-controller-manager-6d6bd468b-db2v8\" (UID: \"876c14aa-a86a-495f-a110-3ade7d8d69fb\") " pod="openstack-operators/heat-operator-controller-manager-6d6bd468b-db2v8" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.533334 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6kkz\" (UniqueName: \"kubernetes.io/projected/9bf45de7-ba46-4ce9-a7d7-fc26e253423b-kube-api-access-f6kkz\") pod \"cinder-operator-controller-manager-cb6d66846-b5jzw\" (UID: \"9bf45de7-ba46-4ce9-a7d7-fc26e253423b\") " pod="openstack-operators/cinder-operator-controller-manager-cb6d66846-b5jzw" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.533372 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrtz8\" (UniqueName: \"kubernetes.io/projected/4492ee98-efe4-49c3-8c14-86453a8e8714-kube-api-access-wrtz8\") pod \"horizon-operator-controller-manager-5b9475cdd7-586s5\" (UID: \"4492ee98-efe4-49c3-8c14-86453a8e8714\") " pod="openstack-operators/horizon-operator-controller-manager-5b9475cdd7-586s5" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.533440 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqrxt\" (UniqueName: \"kubernetes.io/projected/18021394-f27d-422e-a68c-24a19d74ceb8-kube-api-access-qqrxt\") pod \"glance-operator-controller-manager-74d565fbd5-c924b\" (UID: \"18021394-f27d-422e-a68c-24a19d74ceb8\") " pod="openstack-operators/glance-operator-controller-manager-74d565fbd5-c924b" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.533471 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ft5g\" (UniqueName: \"kubernetes.io/projected/b3749fc9-e22e-42b9-8865-68679f7d78f1-kube-api-access-7ft5g\") pod \"barbican-operator-controller-manager-64768694d-m9dwm\" (UID: \"b3749fc9-e22e-42b9-8865-68679f7d78f1\") " pod="openstack-operators/barbican-operator-controller-manager-64768694d-m9dwm" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.533519 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert\") pod \"infra-operator-controller-manager-fbfb5bd65-v7cqm\" (UID: \"e43b6f19-e463-43ce-9efe-5cefa3b53682\") " pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.533557 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ffmh\" (UniqueName: 
\"kubernetes.io/projected/e43b6f19-e463-43ce-9efe-5cefa3b53682-kube-api-access-8ffmh\") pod \"infra-operator-controller-manager-fbfb5bd65-v7cqm\" (UID: \"e43b6f19-e463-43ce-9efe-5cefa3b53682\") " pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.534272 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6f6f57b9b6-9c7lm"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.566234 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-645c9f6488-9bn7h"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.567134 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-645c9f6488-9bn7h" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.569572 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-sd8m7" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.571321 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfnmj\" (UniqueName: \"kubernetes.io/projected/6d1aed1b-6436-46ca-a824-59eafb8ca5d3-kube-api-access-qfnmj\") pod \"designate-operator-controller-manager-9c8c85cd7-8xwbp\" (UID: \"6d1aed1b-6436-46ca-a824-59eafb8ca5d3\") " pod="openstack-operators/designate-operator-controller-manager-9c8c85cd7-8xwbp" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.575557 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-744456f686-nnfnk"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.576444 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-744456f686-nnfnk" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.577454 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6kkz\" (UniqueName: \"kubernetes.io/projected/9bf45de7-ba46-4ce9-a7d7-fc26e253423b-kube-api-access-f6kkz\") pod \"cinder-operator-controller-manager-cb6d66846-b5jzw\" (UID: \"9bf45de7-ba46-4ce9-a7d7-fc26e253423b\") " pod="openstack-operators/cinder-operator-controller-manager-cb6d66846-b5jzw" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.578149 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-4chnp" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.584741 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-58ff56fcc7-9kjv4"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.587292 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-58ff56fcc7-9kjv4" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.590683 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-ntwfd" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.595280 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-58ff56fcc7-9kjv4"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.601548 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-645c9f6488-9bn7h"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.614044 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ft5g\" (UniqueName: \"kubernetes.io/projected/b3749fc9-e22e-42b9-8865-68679f7d78f1-kube-api-access-7ft5g\") pod \"barbican-operator-controller-manager-64768694d-m9dwm\" (UID: \"b3749fc9-e22e-42b9-8865-68679f7d78f1\") " pod="openstack-operators/barbican-operator-controller-manager-64768694d-m9dwm" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.626953 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-cb6d66846-b5jzw" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.643353 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrtz8\" (UniqueName: \"kubernetes.io/projected/4492ee98-efe4-49c3-8c14-86453a8e8714-kube-api-access-wrtz8\") pod \"horizon-operator-controller-manager-5b9475cdd7-586s5\" (UID: \"4492ee98-efe4-49c3-8c14-86453a8e8714\") " pod="openstack-operators/horizon-operator-controller-manager-5b9475cdd7-586s5" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.643437 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbd9m\" (UniqueName: \"kubernetes.io/projected/19ee7ede-7bda-46bc-8413-95262fa53969-kube-api-access-sbd9m\") pod \"manila-operator-controller-manager-6f6f57b9b6-9c7lm\" (UID: \"19ee7ede-7bda-46bc-8413-95262fa53969\") " pod="openstack-operators/manila-operator-controller-manager-6f6f57b9b6-9c7lm" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.643476 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqrxt\" (UniqueName: \"kubernetes.io/projected/18021394-f27d-422e-a68c-24a19d74ceb8-kube-api-access-qqrxt\") pod \"glance-operator-controller-manager-74d565fbd5-c924b\" (UID: \"18021394-f27d-422e-a68c-24a19d74ceb8\") " pod="openstack-operators/glance-operator-controller-manager-74d565fbd5-c924b" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.643524 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2x9r\" (UniqueName: \"kubernetes.io/projected/18ad86e0-d070-4de1-bd50-a93f9abdf715-kube-api-access-m2x9r\") pod \"keystone-operator-controller-manager-68f8d496f8-4knlv\" (UID: \"18ad86e0-d070-4de1-bd50-a93f9abdf715\") " pod="openstack-operators/keystone-operator-controller-manager-68f8d496f8-4knlv" Mar 14 09:14:08 
crc kubenswrapper[4956]: I0314 09:14:08.643551 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert\") pod \"infra-operator-controller-manager-fbfb5bd65-v7cqm\" (UID: \"e43b6f19-e463-43ce-9efe-5cefa3b53682\") " pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.643587 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ffmh\" (UniqueName: \"kubernetes.io/projected/e43b6f19-e463-43ce-9efe-5cefa3b53682-kube-api-access-8ffmh\") pod \"infra-operator-controller-manager-fbfb5bd65-v7cqm\" (UID: \"e43b6f19-e463-43ce-9efe-5cefa3b53682\") " pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.643638 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvfsw\" (UniqueName: \"kubernetes.io/projected/401a3c6d-db2a-435b-b7f5-08816736d895-kube-api-access-jvfsw\") pod \"ironic-operator-controller-manager-bf6b7fd8c-56k2s\" (UID: \"401a3c6d-db2a-435b-b7f5-08816736d895\") " pod="openstack-operators/ironic-operator-controller-manager-bf6b7fd8c-56k2s" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.643668 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxwkm\" (UniqueName: \"kubernetes.io/projected/876c14aa-a86a-495f-a110-3ade7d8d69fb-kube-api-access-hxwkm\") pod \"heat-operator-controller-manager-6d6bd468b-db2v8\" (UID: \"876c14aa-a86a-495f-a110-3ade7d8d69fb\") " pod="openstack-operators/heat-operator-controller-manager-6d6bd468b-db2v8" Mar 14 09:14:08 crc kubenswrapper[4956]: E0314 09:14:08.644238 4956 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found 
Mar 14 09:14:08 crc kubenswrapper[4956]: E0314 09:14:08.644287 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert podName:e43b6f19-e463-43ce-9efe-5cefa3b53682 nodeName:}" failed. No retries permitted until 2026-03-14 09:14:09.144271778 +0000 UTC m=+1054.656964046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert") pod "infra-operator-controller-manager-fbfb5bd65-v7cqm" (UID: "e43b6f19-e463-43ce-9efe-5cefa3b53682") : secret "infra-operator-webhook-server-cert" not found Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.645854 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7cf9f49d6-qwdtd"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.647798 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7cf9f49d6-qwdtd" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.653919 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-8hvxt" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.660612 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-744456f686-nnfnk"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.678399 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqrxt\" (UniqueName: \"kubernetes.io/projected/18021394-f27d-422e-a68c-24a19d74ceb8-kube-api-access-qqrxt\") pod \"glance-operator-controller-manager-74d565fbd5-c924b\" (UID: \"18021394-f27d-422e-a68c-24a19d74ceb8\") " pod="openstack-operators/glance-operator-controller-manager-74d565fbd5-c924b" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.679899 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ffmh\" (UniqueName: \"kubernetes.io/projected/e43b6f19-e463-43ce-9efe-5cefa3b53682-kube-api-access-8ffmh\") pod \"infra-operator-controller-manager-fbfb5bd65-v7cqm\" (UID: \"e43b6f19-e463-43ce-9efe-5cefa3b53682\") " pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.683022 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrtz8\" (UniqueName: \"kubernetes.io/projected/4492ee98-efe4-49c3-8c14-86453a8e8714-kube-api-access-wrtz8\") pod \"horizon-operator-controller-manager-5b9475cdd7-586s5\" (UID: \"4492ee98-efe4-49c3-8c14-86453a8e8714\") " pod="openstack-operators/horizon-operator-controller-manager-5b9475cdd7-586s5" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.683292 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7cf9f49d6-qwdtd"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.684071 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9c8c85cd7-8xwbp" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.685846 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxwkm\" (UniqueName: \"kubernetes.io/projected/876c14aa-a86a-495f-a110-3ade7d8d69fb-kube-api-access-hxwkm\") pod \"heat-operator-controller-manager-6d6bd468b-db2v8\" (UID: \"876c14aa-a86a-495f-a110-3ade7d8d69fb\") " pod="openstack-operators/heat-operator-controller-manager-6d6bd468b-db2v8" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.713767 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-74d565fbd5-c924b" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.729817 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d6bd468b-db2v8" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.732787 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.740226 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.742143 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9475cdd7-586s5" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.743006 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.743051 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-c8dww" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.744433 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d9ll\" (UniqueName: \"kubernetes.io/projected/fdc31f77-84df-4657-b5ec-a7fcd8b673e2-kube-api-access-6d9ll\") pod \"nova-operator-controller-manager-58ff56fcc7-9kjv4\" (UID: \"fdc31f77-84df-4657-b5ec-a7fcd8b673e2\") " pod="openstack-operators/nova-operator-controller-manager-58ff56fcc7-9kjv4" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.744513 4956 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-sbd9m\" (UniqueName: \"kubernetes.io/projected/19ee7ede-7bda-46bc-8413-95262fa53969-kube-api-access-sbd9m\") pod \"manila-operator-controller-manager-6f6f57b9b6-9c7lm\" (UID: \"19ee7ede-7bda-46bc-8413-95262fa53969\") " pod="openstack-operators/manila-operator-controller-manager-6f6f57b9b6-9c7lm" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.744548 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgq7h\" (UniqueName: \"kubernetes.io/projected/240f1b23-5499-4644-a442-9647e71a33d4-kube-api-access-sgq7h\") pod \"neutron-operator-controller-manager-645c9f6488-9bn7h\" (UID: \"240f1b23-5499-4644-a442-9647e71a33d4\") " pod="openstack-operators/neutron-operator-controller-manager-645c9f6488-9bn7h" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.744576 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2x9r\" (UniqueName: \"kubernetes.io/projected/18ad86e0-d070-4de1-bd50-a93f9abdf715-kube-api-access-m2x9r\") pod \"keystone-operator-controller-manager-68f8d496f8-4knlv\" (UID: \"18ad86e0-d070-4de1-bd50-a93f9abdf715\") " pod="openstack-operators/keystone-operator-controller-manager-68f8d496f8-4knlv" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.744638 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qflkd\" (UniqueName: \"kubernetes.io/projected/e0b8fc0f-4ffa-4c26-84e4-9613f5161286-kube-api-access-qflkd\") pod \"octavia-operator-controller-manager-7cf9f49d6-qwdtd\" (UID: \"e0b8fc0f-4ffa-4c26-84e4-9613f5161286\") " pod="openstack-operators/octavia-operator-controller-manager-7cf9f49d6-qwdtd" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.744669 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh59w\" (UniqueName: 
\"kubernetes.io/projected/03bd3900-f5fa-476e-a91a-f492e4a424dc-kube-api-access-mh59w\") pod \"mariadb-operator-controller-manager-744456f686-nnfnk\" (UID: \"03bd3900-f5fa-476e-a91a-f492e4a424dc\") " pod="openstack-operators/mariadb-operator-controller-manager-744456f686-nnfnk" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.744701 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvfsw\" (UniqueName: \"kubernetes.io/projected/401a3c6d-db2a-435b-b7f5-08816736d895-kube-api-access-jvfsw\") pod \"ironic-operator-controller-manager-bf6b7fd8c-56k2s\" (UID: \"401a3c6d-db2a-435b-b7f5-08816736d895\") " pod="openstack-operators/ironic-operator-controller-manager-bf6b7fd8c-56k2s" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.749105 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-848d74f969-6n97h"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.766619 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-848d74f969-6n97h" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.772880 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-848d74f969-6n97h"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.774189 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvfsw\" (UniqueName: \"kubernetes.io/projected/401a3c6d-db2a-435b-b7f5-08816736d895-kube-api-access-jvfsw\") pod \"ironic-operator-controller-manager-bf6b7fd8c-56k2s\" (UID: \"401a3c6d-db2a-435b-b7f5-08816736d895\") " pod="openstack-operators/ironic-operator-controller-manager-bf6b7fd8c-56k2s" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.777438 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-t4mf6" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.779801 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.784652 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2x9r\" (UniqueName: \"kubernetes.io/projected/18ad86e0-d070-4de1-bd50-a93f9abdf715-kube-api-access-m2x9r\") pod \"keystone-operator-controller-manager-68f8d496f8-4knlv\" (UID: \"18ad86e0-d070-4de1-bd50-a93f9abdf715\") " pod="openstack-operators/keystone-operator-controller-manager-68f8d496f8-4knlv" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.786979 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbd9m\" (UniqueName: \"kubernetes.io/projected/19ee7ede-7bda-46bc-8413-95262fa53969-kube-api-access-sbd9m\") pod \"manila-operator-controller-manager-6f6f57b9b6-9c7lm\" (UID: \"19ee7ede-7bda-46bc-8413-95262fa53969\") " 
pod="openstack-operators/manila-operator-controller-manager-6f6f57b9b6-9c7lm" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.793609 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-b5c469fd-bffxq"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.794761 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-b5c469fd-bffxq" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.801394 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-b5c469fd-bffxq"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.803827 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-tt9gz" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.804421 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-bf6b7fd8c-56k2s" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.822543 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f7469dbc6-6djhw"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.823521 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f7469dbc6-6djhw" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.829760 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-mb5ll" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.830925 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-68f8d496f8-4knlv" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.832564 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557994-zvc72" event={"ID":"414e50d4-0a21-488c-b82c-95e9efd119cb","Type":"ContainerDied","Data":"2daddbff6936d5d27f1497bac564b7cf39a68012b4a173dd128747b9ccb4a02b"} Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.832591 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2daddbff6936d5d27f1497bac564b7cf39a68012b4a173dd128747b9ccb4a02b" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.832726 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557994-zvc72" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.838443 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f7469dbc6-6djhw"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.838822 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6f6f57b9b6-9c7lm" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.849278 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d9ll\" (UniqueName: \"kubernetes.io/projected/fdc31f77-84df-4657-b5ec-a7fcd8b673e2-kube-api-access-6d9ll\") pod \"nova-operator-controller-manager-58ff56fcc7-9kjv4\" (UID: \"fdc31f77-84df-4657-b5ec-a7fcd8b673e2\") " pod="openstack-operators/nova-operator-controller-manager-58ff56fcc7-9kjv4" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.849342 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8rzm\" (UniqueName: \"kubernetes.io/projected/a8d7db47-dd9b-4785-8e80-d7d97a324225-kube-api-access-x8rzm\") pod \"ovn-operator-controller-manager-848d74f969-6n97h\" (UID: \"a8d7db47-dd9b-4785-8e80-d7d97a324225\") " pod="openstack-operators/ovn-operator-controller-manager-848d74f969-6n97h" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.849379 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgq7h\" (UniqueName: \"kubernetes.io/projected/240f1b23-5499-4644-a442-9647e71a33d4-kube-api-access-sgq7h\") pod \"neutron-operator-controller-manager-645c9f6488-9bn7h\" (UID: \"240f1b23-5499-4644-a442-9647e71a33d4\") " pod="openstack-operators/neutron-operator-controller-manager-645c9f6488-9bn7h" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.849416 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph\" (UID: \"67182d33-3abc-4661-8614-94238efc9e45\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" Mar 14 09:14:08 crc kubenswrapper[4956]: 
I0314 09:14:08.849433 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6lsq\" (UniqueName: \"kubernetes.io/projected/67182d33-3abc-4661-8614-94238efc9e45-kube-api-access-c6lsq\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph\" (UID: \"67182d33-3abc-4661-8614-94238efc9e45\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.849470 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qflkd\" (UniqueName: \"kubernetes.io/projected/e0b8fc0f-4ffa-4c26-84e4-9613f5161286-kube-api-access-qflkd\") pod \"octavia-operator-controller-manager-7cf9f49d6-qwdtd\" (UID: \"e0b8fc0f-4ffa-4c26-84e4-9613f5161286\") " pod="openstack-operators/octavia-operator-controller-manager-7cf9f49d6-qwdtd" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.849510 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh59w\" (UniqueName: \"kubernetes.io/projected/03bd3900-f5fa-476e-a91a-f492e4a424dc-kube-api-access-mh59w\") pod \"mariadb-operator-controller-manager-744456f686-nnfnk\" (UID: \"03bd3900-f5fa-476e-a91a-f492e4a424dc\") " pod="openstack-operators/mariadb-operator-controller-manager-744456f686-nnfnk" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.873525 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6646df7cdb-tspwx"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.874578 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6646df7cdb-tspwx" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.880978 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh59w\" (UniqueName: \"kubernetes.io/projected/03bd3900-f5fa-476e-a91a-f492e4a424dc-kube-api-access-mh59w\") pod \"mariadb-operator-controller-manager-744456f686-nnfnk\" (UID: \"03bd3900-f5fa-476e-a91a-f492e4a424dc\") " pod="openstack-operators/mariadb-operator-controller-manager-744456f686-nnfnk" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.882152 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-hs5hw" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.885403 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qflkd\" (UniqueName: \"kubernetes.io/projected/e0b8fc0f-4ffa-4c26-84e4-9613f5161286-kube-api-access-qflkd\") pod \"octavia-operator-controller-manager-7cf9f49d6-qwdtd\" (UID: \"e0b8fc0f-4ffa-4c26-84e4-9613f5161286\") " pod="openstack-operators/octavia-operator-controller-manager-7cf9f49d6-qwdtd" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.895297 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d9ll\" (UniqueName: \"kubernetes.io/projected/fdc31f77-84df-4657-b5ec-a7fcd8b673e2-kube-api-access-6d9ll\") pod \"nova-operator-controller-manager-58ff56fcc7-9kjv4\" (UID: \"fdc31f77-84df-4657-b5ec-a7fcd8b673e2\") " pod="openstack-operators/nova-operator-controller-manager-58ff56fcc7-9kjv4" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.904052 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64768694d-m9dwm" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.915654 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6646df7cdb-tspwx"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.917061 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgq7h\" (UniqueName: \"kubernetes.io/projected/240f1b23-5499-4644-a442-9647e71a33d4-kube-api-access-sgq7h\") pod \"neutron-operator-controller-manager-645c9f6488-9bn7h\" (UID: \"240f1b23-5499-4644-a442-9647e71a33d4\") " pod="openstack-operators/neutron-operator-controller-manager-645c9f6488-9bn7h" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.940360 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz"] Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.941609 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.943405 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-mxsg7" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.948461 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-645c9f6488-9bn7h" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.951015 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljv95\" (UniqueName: \"kubernetes.io/projected/24e7f369-d268-4aa7-89f9-1b4ce48fd197-kube-api-access-ljv95\") pod \"swift-operator-controller-manager-7f7469dbc6-6djhw\" (UID: \"24e7f369-d268-4aa7-89f9-1b4ce48fd197\") " pod="openstack-operators/swift-operator-controller-manager-7f7469dbc6-6djhw" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.951090 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7h9q\" (UniqueName: \"kubernetes.io/projected/bad25fda-3055-4fa2-8fd4-24980a88c7c6-kube-api-access-j7h9q\") pod \"placement-operator-controller-manager-b5c469fd-bffxq\" (UID: \"bad25fda-3055-4fa2-8fd4-24980a88c7c6\") " pod="openstack-operators/placement-operator-controller-manager-b5c469fd-bffxq" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.951121 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8rzm\" (UniqueName: \"kubernetes.io/projected/a8d7db47-dd9b-4785-8e80-d7d97a324225-kube-api-access-x8rzm\") pod \"ovn-operator-controller-manager-848d74f969-6n97h\" (UID: \"a8d7db47-dd9b-4785-8e80-d7d97a324225\") " pod="openstack-operators/ovn-operator-controller-manager-848d74f969-6n97h" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.951167 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph\" (UID: \"67182d33-3abc-4661-8614-94238efc9e45\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" Mar 14 09:14:08 crc kubenswrapper[4956]: 
I0314 09:14:08.951187 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6lsq\" (UniqueName: \"kubernetes.io/projected/67182d33-3abc-4661-8614-94238efc9e45-kube-api-access-c6lsq\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph\" (UID: \"67182d33-3abc-4661-8614-94238efc9e45\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.951558 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz"] Mar 14 09:14:08 crc kubenswrapper[4956]: E0314 09:14:08.951718 4956 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 09:14:08 crc kubenswrapper[4956]: E0314 09:14:08.951800 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert podName:67182d33-3abc-4661-8614-94238efc9e45 nodeName:}" failed. No retries permitted until 2026-03-14 09:14:09.451781537 +0000 UTC m=+1054.964473805 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" (UID: "67182d33-3abc-4661-8614-94238efc9e45") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.974699 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-744456f686-nnfnk" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.975234 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8rzm\" (UniqueName: \"kubernetes.io/projected/a8d7db47-dd9b-4785-8e80-d7d97a324225-kube-api-access-x8rzm\") pod \"ovn-operator-controller-manager-848d74f969-6n97h\" (UID: \"a8d7db47-dd9b-4785-8e80-d7d97a324225\") " pod="openstack-operators/ovn-operator-controller-manager-848d74f969-6n97h" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.990790 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-58ff56fcc7-9kjv4" Mar 14 09:14:08 crc kubenswrapper[4956]: I0314 09:14:08.996248 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6lsq\" (UniqueName: \"kubernetes.io/projected/67182d33-3abc-4661-8614-94238efc9e45-kube-api-access-c6lsq\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph\" (UID: \"67182d33-3abc-4661-8614-94238efc9e45\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.005626 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd"] Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.006527 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.008204 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-w27pd" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.011950 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd"] Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.015659 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7cf9f49d6-qwdtd" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.034157 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q"] Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.035245 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.040459 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wwlv9" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.040644 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.041054 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.055152 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljv95\" (UniqueName: \"kubernetes.io/projected/24e7f369-d268-4aa7-89f9-1b4ce48fd197-kube-api-access-ljv95\") pod \"swift-operator-controller-manager-7f7469dbc6-6djhw\" (UID: \"24e7f369-d268-4aa7-89f9-1b4ce48fd197\") " pod="openstack-operators/swift-operator-controller-manager-7f7469dbc6-6djhw" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.055216 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmxsj\" (UniqueName: \"kubernetes.io/projected/2c1f44c6-aae3-4c3c-933e-c87956fb0fe6-kube-api-access-kmxsj\") pod \"telemetry-operator-controller-manager-6646df7cdb-tspwx\" (UID: \"2c1f44c6-aae3-4c3c-933e-c87956fb0fe6\") " pod="openstack-operators/telemetry-operator-controller-manager-6646df7cdb-tspwx" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.055266 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x2sw\" (UniqueName: \"kubernetes.io/projected/f1b524ae-60fa-48a7-aa07-bf354bd2ff62-kube-api-access-8x2sw\") pod \"test-operator-controller-manager-8467ccb4c8-r9dtz\" (UID: \"f1b524ae-60fa-48a7-aa07-bf354bd2ff62\") 
" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.055294 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7h9q\" (UniqueName: \"kubernetes.io/projected/bad25fda-3055-4fa2-8fd4-24980a88c7c6-kube-api-access-j7h9q\") pod \"placement-operator-controller-manager-b5c469fd-bffxq\" (UID: \"bad25fda-3055-4fa2-8fd4-24980a88c7c6\") " pod="openstack-operators/placement-operator-controller-manager-b5c469fd-bffxq" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.062869 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q"] Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.109771 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-848d74f969-6n97h" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.132953 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vppb7"] Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.134129 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljv95\" (UniqueName: \"kubernetes.io/projected/24e7f369-d268-4aa7-89f9-1b4ce48fd197-kube-api-access-ljv95\") pod \"swift-operator-controller-manager-7f7469dbc6-6djhw\" (UID: \"24e7f369-d268-4aa7-89f9-1b4ce48fd197\") " pod="openstack-operators/swift-operator-controller-manager-7f7469dbc6-6djhw" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.134751 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vppb7" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.139381 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7h9q\" (UniqueName: \"kubernetes.io/projected/bad25fda-3055-4fa2-8fd4-24980a88c7c6-kube-api-access-j7h9q\") pod \"placement-operator-controller-manager-b5c469fd-bffxq\" (UID: \"bad25fda-3055-4fa2-8fd4-24980a88c7c6\") " pod="openstack-operators/placement-operator-controller-manager-b5c469fd-bffxq" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.162265 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-b8wbp" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.165413 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.165500 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2jvk\" (UniqueName: \"kubernetes.io/projected/6bff6422-f245-457e-9ddb-28f957c9edac-kube-api-access-f2jvk\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.165611 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs\") pod 
\"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.165676 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmxsj\" (UniqueName: \"kubernetes.io/projected/2c1f44c6-aae3-4c3c-933e-c87956fb0fe6-kube-api-access-kmxsj\") pod \"telemetry-operator-controller-manager-6646df7cdb-tspwx\" (UID: \"2c1f44c6-aae3-4c3c-933e-c87956fb0fe6\") " pod="openstack-operators/telemetry-operator-controller-manager-6646df7cdb-tspwx" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.165733 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x2sw\" (UniqueName: \"kubernetes.io/projected/f1b524ae-60fa-48a7-aa07-bf354bd2ff62-kube-api-access-8x2sw\") pod \"test-operator-controller-manager-8467ccb4c8-r9dtz\" (UID: \"f1b524ae-60fa-48a7-aa07-bf354bd2ff62\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.165825 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert\") pod \"infra-operator-controller-manager-fbfb5bd65-v7cqm\" (UID: \"e43b6f19-e463-43ce-9efe-5cefa3b53682\") " pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.165930 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwtw8\" (UniqueName: \"kubernetes.io/projected/c5b17a07-5a25-4d01-80fb-bdbc8547cda7-kube-api-access-fwtw8\") pod \"watcher-operator-controller-manager-7cc8dbcb54-tckbd\" (UID: \"c5b17a07-5a25-4d01-80fb-bdbc8547cda7\") " 
pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" Mar 14 09:14:09 crc kubenswrapper[4956]: E0314 09:14:09.166578 4956 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 09:14:09 crc kubenswrapper[4956]: E0314 09:14:09.166657 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert podName:e43b6f19-e463-43ce-9efe-5cefa3b53682 nodeName:}" failed. No retries permitted until 2026-03-14 09:14:10.1666137 +0000 UTC m=+1055.679305968 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert") pod "infra-operator-controller-manager-fbfb5bd65-v7cqm" (UID: "e43b6f19-e463-43ce-9efe-5cefa3b53682") : secret "infra-operator-webhook-server-cert" not found Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.177842 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f7469dbc6-6djhw" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.198411 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vppb7"] Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.200254 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x2sw\" (UniqueName: \"kubernetes.io/projected/f1b524ae-60fa-48a7-aa07-bf354bd2ff62-kube-api-access-8x2sw\") pod \"test-operator-controller-manager-8467ccb4c8-r9dtz\" (UID: \"f1b524ae-60fa-48a7-aa07-bf354bd2ff62\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.220296 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmxsj\" (UniqueName: \"kubernetes.io/projected/2c1f44c6-aae3-4c3c-933e-c87956fb0fe6-kube-api-access-kmxsj\") pod \"telemetry-operator-controller-manager-6646df7cdb-tspwx\" (UID: \"2c1f44c6-aae3-4c3c-933e-c87956fb0fe6\") " pod="openstack-operators/telemetry-operator-controller-manager-6646df7cdb-tspwx" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.245657 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6646df7cdb-tspwx" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.266800 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557988-hp29s"] Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.271897 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqtp8\" (UniqueName: \"kubernetes.io/projected/3d61bf91-4992-47ad-8a53-e823a71d3f9c-kube-api-access-jqtp8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-vppb7\" (UID: \"3d61bf91-4992-47ad-8a53-e823a71d3f9c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vppb7" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.271981 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwtw8\" (UniqueName: \"kubernetes.io/projected/c5b17a07-5a25-4d01-80fb-bdbc8547cda7-kube-api-access-fwtw8\") pod \"watcher-operator-controller-manager-7cc8dbcb54-tckbd\" (UID: \"c5b17a07-5a25-4d01-80fb-bdbc8547cda7\") " pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.272046 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.272088 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2jvk\" (UniqueName: \"kubernetes.io/projected/6bff6422-f245-457e-9ddb-28f957c9edac-kube-api-access-f2jvk\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" 
(UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.272125 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:09 crc kubenswrapper[4956]: E0314 09:14:09.273075 4956 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 09:14:09 crc kubenswrapper[4956]: E0314 09:14:09.273203 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs podName:6bff6422-f245-457e-9ddb-28f957c9edac nodeName:}" failed. No retries permitted until 2026-03-14 09:14:09.77318053 +0000 UTC m=+1055.285872798 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs") pod "openstack-operator-controller-manager-59b5586c67-c9k5q" (UID: "6bff6422-f245-457e-9ddb-28f957c9edac") : secret "webhook-server-cert" not found Mar 14 09:14:09 crc kubenswrapper[4956]: E0314 09:14:09.274144 4956 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 09:14:09 crc kubenswrapper[4956]: E0314 09:14:09.274235 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs podName:6bff6422-f245-457e-9ddb-28f957c9edac nodeName:}" failed. No retries permitted until 2026-03-14 09:14:09.774211526 +0000 UTC m=+1055.286903864 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs") pod "openstack-operator-controller-manager-59b5586c67-c9k5q" (UID: "6bff6422-f245-457e-9ddb-28f957c9edac") : secret "metrics-server-cert" not found Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.283320 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557988-hp29s"] Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.296727 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwtw8\" (UniqueName: \"kubernetes.io/projected/c5b17a07-5a25-4d01-80fb-bdbc8547cda7-kube-api-access-fwtw8\") pod \"watcher-operator-controller-manager-7cc8dbcb54-tckbd\" (UID: \"c5b17a07-5a25-4d01-80fb-bdbc8547cda7\") " pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.299870 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2jvk\" (UniqueName: \"kubernetes.io/projected/6bff6422-f245-457e-9ddb-28f957c9edac-kube-api-access-f2jvk\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.374674 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqtp8\" (UniqueName: \"kubernetes.io/projected/3d61bf91-4992-47ad-8a53-e823a71d3f9c-kube-api-access-jqtp8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-vppb7\" (UID: \"3d61bf91-4992-47ad-8a53-e823a71d3f9c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vppb7" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.398768 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.399435 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.416744 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqtp8\" (UniqueName: \"kubernetes.io/projected/3d61bf91-4992-47ad-8a53-e823a71d3f9c-kube-api-access-jqtp8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-vppb7\" (UID: \"3d61bf91-4992-47ad-8a53-e823a71d3f9c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vppb7" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.435585 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-b5c469fd-bffxq" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.472574 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vppb7" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.475620 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph\" (UID: \"67182d33-3abc-4661-8614-94238efc9e45\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" Mar 14 09:14:09 crc kubenswrapper[4956]: E0314 09:14:09.476272 4956 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 09:14:09 crc kubenswrapper[4956]: E0314 09:14:09.476324 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert podName:67182d33-3abc-4661-8614-94238efc9e45 nodeName:}" failed. No retries permitted until 2026-03-14 09:14:10.476307673 +0000 UTC m=+1055.988999941 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" (UID: "67182d33-3abc-4661-8614-94238efc9e45") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.784324 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.784377 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:09 crc kubenswrapper[4956]: E0314 09:14:09.784518 4956 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 09:14:09 crc kubenswrapper[4956]: E0314 09:14:09.784570 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs podName:6bff6422-f245-457e-9ddb-28f957c9edac nodeName:}" failed. No retries permitted until 2026-03-14 09:14:10.784553509 +0000 UTC m=+1056.297245777 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs") pod "openstack-operator-controller-manager-59b5586c67-c9k5q" (UID: "6bff6422-f245-457e-9ddb-28f957c9edac") : secret "webhook-server-cert" not found Mar 14 09:14:09 crc kubenswrapper[4956]: E0314 09:14:09.784934 4956 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 09:14:09 crc kubenswrapper[4956]: E0314 09:14:09.784964 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs podName:6bff6422-f245-457e-9ddb-28f957c9edac nodeName:}" failed. No retries permitted until 2026-03-14 09:14:10.784956329 +0000 UTC m=+1056.297648597 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs") pod "openstack-operator-controller-manager-59b5586c67-c9k5q" (UID: "6bff6422-f245-457e-9ddb-28f957c9edac") : secret "metrics-server-cert" not found Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.923341 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9c8c85cd7-8xwbp"] Mar 14 09:14:09 crc kubenswrapper[4956]: I0314 09:14:09.942306 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-cb6d66846-b5jzw"] Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.200312 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert\") pod \"infra-operator-controller-manager-fbfb5bd65-v7cqm\" (UID: \"e43b6f19-e463-43ce-9efe-5cefa3b53682\") " pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" Mar 14 09:14:10 crc 
kubenswrapper[4956]: E0314 09:14:10.200444 4956 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.200507 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert podName:e43b6f19-e463-43ce-9efe-5cefa3b53682 nodeName:}" failed. No retries permitted until 2026-03-14 09:14:12.200476204 +0000 UTC m=+1057.713168472 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert") pod "infra-operator-controller-manager-fbfb5bd65-v7cqm" (UID: "e43b6f19-e463-43ce-9efe-5cefa3b53682") : secret "infra-operator-webhook-server-cert" not found Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.353223 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d6bd468b-db2v8"] Mar 14 09:14:10 crc kubenswrapper[4956]: W0314 09:14:10.383035 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod876c14aa_a86a_495f_a110_3ade7d8d69fb.slice/crio-013e6cf423a0ae2a71c61279665690928b9e4ccfcf4a8edd3ed38f894c44a268 WatchSource:0}: Error finding container 013e6cf423a0ae2a71c61279665690928b9e4ccfcf4a8edd3ed38f894c44a268: Status 404 returned error can't find the container with id 013e6cf423a0ae2a71c61279665690928b9e4ccfcf4a8edd3ed38f894c44a268 Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.414846 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-74d565fbd5-c924b"] Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.422068 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-68f8d496f8-4knlv"] Mar 14 09:14:10 crc 
kubenswrapper[4956]: I0314 09:14:10.429236 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9475cdd7-586s5"] Mar 14 09:14:10 crc kubenswrapper[4956]: W0314 09:14:10.433641 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4492ee98_efe4_49c3_8c14_86453a8e8714.slice/crio-844396704f7682e138bcebf61f636ec017f0e13aadeb3c2aaa92317747227343 WatchSource:0}: Error finding container 844396704f7682e138bcebf61f636ec017f0e13aadeb3c2aaa92317747227343: Status 404 returned error can't find the container with id 844396704f7682e138bcebf61f636ec017f0e13aadeb3c2aaa92317747227343 Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.476481 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-bf6b7fd8c-56k2s"] Mar 14 09:14:10 crc kubenswrapper[4956]: W0314 09:14:10.484748 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod401a3c6d_db2a_435b_b7f5_08816736d895.slice/crio-b754fb30d88eaca49e3e1c635f5f6973494d9348a73cd3de30c8bed10517c05a WatchSource:0}: Error finding container b754fb30d88eaca49e3e1c635f5f6973494d9348a73cd3de30c8bed10517c05a: Status 404 returned error can't find the container with id b754fb30d88eaca49e3e1c635f5f6973494d9348a73cd3de30c8bed10517c05a Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.506160 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph\" (UID: \"67182d33-3abc-4661-8614-94238efc9e45\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.506332 4956 secret.go:188] Couldn't get 
secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.506384 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert podName:67182d33-3abc-4661-8614-94238efc9e45 nodeName:}" failed. No retries permitted until 2026-03-14 09:14:12.506367312 +0000 UTC m=+1058.019059580 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" (UID: "67182d33-3abc-4661-8614-94238efc9e45") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.801923 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6646df7cdb-tspwx"] Mar 14 09:14:10 crc kubenswrapper[4956]: W0314 09:14:10.803388 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c1f44c6_aae3_4c3c_933e_c87956fb0fe6.slice/crio-d2da34b59709f88354a455da56a74a92a9256d949288d6fdce248cdeba7bc6da WatchSource:0}: Error finding container d2da34b59709f88354a455da56a74a92a9256d949288d6fdce248cdeba7bc6da: Status 404 returned error can't find the container with id d2da34b59709f88354a455da56a74a92a9256d949288d6fdce248cdeba7bc6da Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.808029 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64768694d-m9dwm"] Mar 14 09:14:10 crc kubenswrapper[4956]: W0314 09:14:10.814649 4956 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3749fc9_e22e_42b9_8865_68679f7d78f1.slice/crio-54584ddb7f08ca1efb2b03a316c965d663e3e5a593acac79c358ce8d40ab3ba8 WatchSource:0}: Error finding container 54584ddb7f08ca1efb2b03a316c965d663e3e5a593acac79c358ce8d40ab3ba8: Status 404 returned error can't find the container with id 54584ddb7f08ca1efb2b03a316c965d663e3e5a593acac79c358ce8d40ab3ba8 Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.817527 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-645c9f6488-9bn7h"] Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.817660 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.817700 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.817818 4956 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.817857 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs podName:6bff6422-f245-457e-9ddb-28f957c9edac nodeName:}" failed. 
No retries permitted until 2026-03-14 09:14:12.817843129 +0000 UTC m=+1058.330535397 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs") pod "openstack-operator-controller-manager-59b5586c67-c9k5q" (UID: "6bff6422-f245-457e-9ddb-28f957c9edac") : secret "webhook-server-cert" not found Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.817897 4956 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.817915 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs podName:6bff6422-f245-457e-9ddb-28f957c9edac nodeName:}" failed. No retries permitted until 2026-03-14 09:14:12.817909211 +0000 UTC m=+1058.330601479 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs") pod "openstack-operator-controller-manager-59b5586c67-c9k5q" (UID: "6bff6422-f245-457e-9ddb-28f957c9edac") : secret "metrics-server-cert" not found Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.831301 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-744456f686-nnfnk"] Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.842810 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-58ff56fcc7-9kjv4"] Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.854187 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd"] Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.866394 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/manila-operator-controller-manager-6f6f57b9b6-9c7lm"] Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.872864 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6646df7cdb-tspwx" event={"ID":"2c1f44c6-aae3-4c3c-933e-c87956fb0fe6","Type":"ContainerStarted","Data":"d2da34b59709f88354a455da56a74a92a9256d949288d6fdce248cdeba7bc6da"} Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.874611 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-bf6b7fd8c-56k2s" event={"ID":"401a3c6d-db2a-435b-b7f5-08816736d895","Type":"ContainerStarted","Data":"b754fb30d88eaca49e3e1c635f5f6973494d9348a73cd3de30c8bed10517c05a"} Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.875892 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f7469dbc6-6djhw"] Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.877336 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64768694d-m9dwm" event={"ID":"b3749fc9-e22e-42b9-8865-68679f7d78f1","Type":"ContainerStarted","Data":"54584ddb7f08ca1efb2b03a316c965d663e3e5a593acac79c358ce8d40ab3ba8"} Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.878895 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d6bd468b-db2v8" event={"ID":"876c14aa-a86a-495f-a110-3ade7d8d69fb","Type":"ContainerStarted","Data":"013e6cf423a0ae2a71c61279665690928b9e4ccfcf4a8edd3ed38f894c44a268"} Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.880132 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9475cdd7-586s5" 
event={"ID":"4492ee98-efe4-49c3-8c14-86453a8e8714","Type":"ContainerStarted","Data":"844396704f7682e138bcebf61f636ec017f0e13aadeb3c2aaa92317747227343"} Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.882724 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-744456f686-nnfnk" event={"ID":"03bd3900-f5fa-476e-a91a-f492e4a424dc","Type":"ContainerStarted","Data":"6975715f897e0fb30d096e3c4b26ff484410058033598253d050d573c631d7a5"} Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.883509 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-68f8d496f8-4knlv" event={"ID":"18ad86e0-d070-4de1-bd50-a93f9abdf715","Type":"ContainerStarted","Data":"35547091b18c05090764fa461b6e6aac4390e988da591973d94fe12303b6e62c"} Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.884275 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-cb6d66846-b5jzw" event={"ID":"9bf45de7-ba46-4ce9-a7d7-fc26e253423b","Type":"ContainerStarted","Data":"83fcc62f447958a16a8df3ae81f498200824ba866b14adcafadcd4b4adf82ddc"} Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.885154 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-645c9f6488-9bn7h" event={"ID":"240f1b23-5499-4644-a442-9647e71a33d4","Type":"ContainerStarted","Data":"38413be4d9061fd833de92f022f9d810a8340d81daf396f68d6bee1704680fea"} Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.886285 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7cf9f49d6-qwdtd"] Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.886619 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9c8c85cd7-8xwbp" 
event={"ID":"6d1aed1b-6436-46ca-a824-59eafb8ca5d3","Type":"ContainerStarted","Data":"149b67767b6bd01988328723d4244447c083ced51a3d2333816c55b418700d3c"} Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.888427 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-74d565fbd5-c924b" event={"ID":"18021394-f27d-422e-a68c-24a19d74ceb8","Type":"ContainerStarted","Data":"5460c56c7c2f7060241dd1c37b631eba30b004640bf42f2e1954a45f7743e024"} Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.891159 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6f6f57b9b6-9c7lm" event={"ID":"19ee7ede-7bda-46bc-8413-95262fa53969","Type":"ContainerStarted","Data":"95e9f4535910e43c908b1ad4c208b596759fc79ef5dd3fb16662bb6864102948"} Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.895755 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz"] Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.907710 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:25b9550f0738285c05af02dda06d4ed9edb64e8200cd487dd8af29dea7717278,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ljv95,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-7f7469dbc6-6djhw_openstack-operators(24e7f369-d268-4aa7-89f9-1b4ce48fd197): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.907766 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8x2sw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-8467ccb4c8-r9dtz_openstack-operators(f1b524ae-60fa-48a7-aa07-bf354bd2ff62): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.908029 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:3123c700e2ee2d124d8532e1324a469a61963beafe141dd1f3be8799ba1f1c46,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x8rzm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-848d74f969-6n97h_openstack-operators(a8d7db47-dd9b-4785-8e80-d7d97a324225): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.908506 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-848d74f969-6n97h"] Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.909033 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" 
with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-7f7469dbc6-6djhw" podUID="24e7f369-d268-4aa7-89f9-1b4ce48fd197" Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.909059 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz" podUID="f1b524ae-60fa-48a7-aa07-bf354bd2ff62" Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.909090 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-848d74f969-6n97h" podUID="a8d7db47-dd9b-4785-8e80-d7d97a324225" Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.912697 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.153:5001/openstack-k8s-operators/watcher-operator:3fad4a9eb56718f26ce2ec186bb570f2695f01c3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fwtw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-7cc8dbcb54-tckbd_openstack-operators(c5b17a07-5a25-4d01-80fb-bdbc8547cda7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.914350 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" podUID="c5b17a07-5a25-4d01-80fb-bdbc8547cda7" Mar 14 09:14:10 crc 
kubenswrapper[4956]: I0314 09:14:10.915076 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vppb7"] Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.923821 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jqtp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-vppb7_openstack-operators(3d61bf91-4992-47ad-8a53-e823a71d3f9c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.925008 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vppb7" podUID="3d61bf91-4992-47ad-8a53-e823a71d3f9c" Mar 14 09:14:10 crc kubenswrapper[4956]: I0314 09:14:10.927833 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-b5c469fd-bffxq"] Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.936591 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:83542545b268e8ca5b0f034d5f3d264545e88d86b327648f9568b3e95f105a01,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j7h9q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-b5c469fd-bffxq_openstack-operators(bad25fda-3055-4fa2-8fd4-24980a88c7c6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 09:14:10 crc kubenswrapper[4956]: E0314 09:14:10.938543 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-b5c469fd-bffxq" podUID="bad25fda-3055-4fa2-8fd4-24980a88c7c6" Mar 14 09:14:11 crc kubenswrapper[4956]: I0314 09:14:11.220686 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af" path="/var/lib/kubelet/pods/ccd0bf5c-c5f8-4c12-a2dc-e2d0791b08af/volumes" Mar 14 09:14:11 crc kubenswrapper[4956]: I0314 09:14:11.905845 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" event={"ID":"c5b17a07-5a25-4d01-80fb-bdbc8547cda7","Type":"ContainerStarted","Data":"caae42fb64ee35e863ba7204f959bc9549152726a1d6bc1f6f079dd4c6aa2e71"} Mar 14 09:14:11 crc kubenswrapper[4956]: I0314 09:14:11.907475 4956 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-b5c469fd-bffxq" event={"ID":"bad25fda-3055-4fa2-8fd4-24980a88c7c6","Type":"ContainerStarted","Data":"454ccf87c23137cb841a2073ee90f331c80233d76cb7f139b5f30f2b8a188071"} Mar 14 09:14:11 crc kubenswrapper[4956]: I0314 09:14:11.909182 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7cf9f49d6-qwdtd" event={"ID":"e0b8fc0f-4ffa-4c26-84e4-9613f5161286","Type":"ContainerStarted","Data":"434617ab2e94dfcab37a4ac2d50ecd1b648caf18e9f5fb9d5a0f1dd90f421d8e"} Mar 14 09:14:11 crc kubenswrapper[4956]: E0314 09:14:11.911738 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.153:5001/openstack-k8s-operators/watcher-operator:3fad4a9eb56718f26ce2ec186bb570f2695f01c3\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" podUID="c5b17a07-5a25-4d01-80fb-bdbc8547cda7" Mar 14 09:14:11 crc kubenswrapper[4956]: E0314 09:14:11.912440 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:83542545b268e8ca5b0f034d5f3d264545e88d86b327648f9568b3e95f105a01\\\"\"" pod="openstack-operators/placement-operator-controller-manager-b5c469fd-bffxq" podUID="bad25fda-3055-4fa2-8fd4-24980a88c7c6" Mar 14 09:14:11 crc kubenswrapper[4956]: I0314 09:14:11.913317 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7f7469dbc6-6djhw" event={"ID":"24e7f369-d268-4aa7-89f9-1b4ce48fd197","Type":"ContainerStarted","Data":"6310f828ccc05429b100aae7ccff2b6bfcbcbe9b3d9004f893552c326e749b02"} Mar 14 09:14:11 crc kubenswrapper[4956]: E0314 09:14:11.915359 4956 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:25b9550f0738285c05af02dda06d4ed9edb64e8200cd487dd8af29dea7717278\\\"\"" pod="openstack-operators/swift-operator-controller-manager-7f7469dbc6-6djhw" podUID="24e7f369-d268-4aa7-89f9-1b4ce48fd197" Mar 14 09:14:11 crc kubenswrapper[4956]: I0314 09:14:11.915803 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-58ff56fcc7-9kjv4" event={"ID":"fdc31f77-84df-4657-b5ec-a7fcd8b673e2","Type":"ContainerStarted","Data":"06ce70a62c75c849d8173603f60f7b3e2c29b9ebddf9ba2144992e0bc58f3466"} Mar 14 09:14:11 crc kubenswrapper[4956]: I0314 09:14:11.916883 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz" event={"ID":"f1b524ae-60fa-48a7-aa07-bf354bd2ff62","Type":"ContainerStarted","Data":"6e5253a5d07295e65087a715af7fee23035f7d573020bdcca9ec11d4753039e9"} Mar 14 09:14:11 crc kubenswrapper[4956]: E0314 09:14:11.918912 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz" podUID="f1b524ae-60fa-48a7-aa07-bf354bd2ff62" Mar 14 09:14:11 crc kubenswrapper[4956]: I0314 09:14:11.919586 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vppb7" event={"ID":"3d61bf91-4992-47ad-8a53-e823a71d3f9c","Type":"ContainerStarted","Data":"d6f361157f866dea82910628b23427152756d51638622822552b1aa90733cef1"} Mar 14 09:14:11 crc kubenswrapper[4956]: E0314 09:14:11.920922 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vppb7" podUID="3d61bf91-4992-47ad-8a53-e823a71d3f9c" Mar 14 09:14:11 crc kubenswrapper[4956]: I0314 09:14:11.923378 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-848d74f969-6n97h" event={"ID":"a8d7db47-dd9b-4785-8e80-d7d97a324225","Type":"ContainerStarted","Data":"1d450fdb48eab426323612756e1f008b91b00e50431f62084f66d099c009faca"} Mar 14 09:14:11 crc kubenswrapper[4956]: E0314 09:14:11.928388 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:3123c700e2ee2d124d8532e1324a469a61963beafe141dd1f3be8799ba1f1c46\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-848d74f969-6n97h" podUID="a8d7db47-dd9b-4785-8e80-d7d97a324225" Mar 14 09:14:12 crc kubenswrapper[4956]: I0314 09:14:12.249877 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert\") pod \"infra-operator-controller-manager-fbfb5bd65-v7cqm\" (UID: \"e43b6f19-e463-43ce-9efe-5cefa3b53682\") " pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" Mar 14 09:14:12 crc kubenswrapper[4956]: E0314 09:14:12.250041 4956 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 09:14:12 crc kubenswrapper[4956]: E0314 09:14:12.250123 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert 
podName:e43b6f19-e463-43ce-9efe-5cefa3b53682 nodeName:}" failed. No retries permitted until 2026-03-14 09:14:16.250104411 +0000 UTC m=+1061.762796679 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert") pod "infra-operator-controller-manager-fbfb5bd65-v7cqm" (UID: "e43b6f19-e463-43ce-9efe-5cefa3b53682") : secret "infra-operator-webhook-server-cert" not found Mar 14 09:14:12 crc kubenswrapper[4956]: I0314 09:14:12.554161 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph\" (UID: \"67182d33-3abc-4661-8614-94238efc9e45\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" Mar 14 09:14:12 crc kubenswrapper[4956]: E0314 09:14:12.554359 4956 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 09:14:12 crc kubenswrapper[4956]: E0314 09:14:12.554446 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert podName:67182d33-3abc-4661-8614-94238efc9e45 nodeName:}" failed. No retries permitted until 2026-03-14 09:14:16.554428251 +0000 UTC m=+1062.067120519 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" (UID: "67182d33-3abc-4661-8614-94238efc9e45") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 09:14:12 crc kubenswrapper[4956]: I0314 09:14:12.858863 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:12 crc kubenswrapper[4956]: I0314 09:14:12.858918 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:12 crc kubenswrapper[4956]: E0314 09:14:12.859038 4956 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 09:14:12 crc kubenswrapper[4956]: E0314 09:14:12.859085 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs podName:6bff6422-f245-457e-9ddb-28f957c9edac nodeName:}" failed. No retries permitted until 2026-03-14 09:14:16.859071728 +0000 UTC m=+1062.371763996 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs") pod "openstack-operator-controller-manager-59b5586c67-c9k5q" (UID: "6bff6422-f245-457e-9ddb-28f957c9edac") : secret "webhook-server-cert" not found Mar 14 09:14:12 crc kubenswrapper[4956]: E0314 09:14:12.859038 4956 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 09:14:12 crc kubenswrapper[4956]: E0314 09:14:12.859156 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs podName:6bff6422-f245-457e-9ddb-28f957c9edac nodeName:}" failed. No retries permitted until 2026-03-14 09:14:16.85914582 +0000 UTC m=+1062.371838088 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs") pod "openstack-operator-controller-manager-59b5586c67-c9k5q" (UID: "6bff6422-f245-457e-9ddb-28f957c9edac") : secret "metrics-server-cert" not found Mar 14 09:14:12 crc kubenswrapper[4956]: E0314 09:14:12.937277 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:83542545b268e8ca5b0f034d5f3d264545e88d86b327648f9568b3e95f105a01\\\"\"" pod="openstack-operators/placement-operator-controller-manager-b5c469fd-bffxq" podUID="bad25fda-3055-4fa2-8fd4-24980a88c7c6" Mar 14 09:14:12 crc kubenswrapper[4956]: E0314 09:14:12.937345 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:25b9550f0738285c05af02dda06d4ed9edb64e8200cd487dd8af29dea7717278\\\"\"" 
pod="openstack-operators/swift-operator-controller-manager-7f7469dbc6-6djhw" podUID="24e7f369-d268-4aa7-89f9-1b4ce48fd197" Mar 14 09:14:12 crc kubenswrapper[4956]: E0314 09:14:12.937424 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vppb7" podUID="3d61bf91-4992-47ad-8a53-e823a71d3f9c" Mar 14 09:14:12 crc kubenswrapper[4956]: E0314 09:14:12.937459 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.153:5001/openstack-k8s-operators/watcher-operator:3fad4a9eb56718f26ce2ec186bb570f2695f01c3\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" podUID="c5b17a07-5a25-4d01-80fb-bdbc8547cda7" Mar 14 09:14:12 crc kubenswrapper[4956]: E0314 09:14:12.937805 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz" podUID="f1b524ae-60fa-48a7-aa07-bf354bd2ff62" Mar 14 09:14:12 crc kubenswrapper[4956]: E0314 09:14:12.938205 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:3123c700e2ee2d124d8532e1324a469a61963beafe141dd1f3be8799ba1f1c46\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-848d74f969-6n97h" podUID="a8d7db47-dd9b-4785-8e80-d7d97a324225" Mar 14 
09:14:16 crc kubenswrapper[4956]: I0314 09:14:16.322655 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert\") pod \"infra-operator-controller-manager-fbfb5bd65-v7cqm\" (UID: \"e43b6f19-e463-43ce-9efe-5cefa3b53682\") " pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" Mar 14 09:14:16 crc kubenswrapper[4956]: E0314 09:14:16.322890 4956 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 09:14:16 crc kubenswrapper[4956]: E0314 09:14:16.323838 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert podName:e43b6f19-e463-43ce-9efe-5cefa3b53682 nodeName:}" failed. No retries permitted until 2026-03-14 09:14:24.323814352 +0000 UTC m=+1069.836506620 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert") pod "infra-operator-controller-manager-fbfb5bd65-v7cqm" (UID: "e43b6f19-e463-43ce-9efe-5cefa3b53682") : secret "infra-operator-webhook-server-cert" not found Mar 14 09:14:16 crc kubenswrapper[4956]: I0314 09:14:16.628274 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph\" (UID: \"67182d33-3abc-4661-8614-94238efc9e45\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" Mar 14 09:14:16 crc kubenswrapper[4956]: E0314 09:14:16.628569 4956 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 09:14:16 crc 
kubenswrapper[4956]: E0314 09:14:16.628666 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert podName:67182d33-3abc-4661-8614-94238efc9e45 nodeName:}" failed. No retries permitted until 2026-03-14 09:14:24.628649214 +0000 UTC m=+1070.141341472 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" (UID: "67182d33-3abc-4661-8614-94238efc9e45") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 09:14:16 crc kubenswrapper[4956]: I0314 09:14:16.932376 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:16 crc kubenswrapper[4956]: I0314 09:14:16.932524 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:16 crc kubenswrapper[4956]: E0314 09:14:16.932608 4956 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 09:14:16 crc kubenswrapper[4956]: E0314 09:14:16.932685 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs podName:6bff6422-f245-457e-9ddb-28f957c9edac nodeName:}" failed. 
No retries permitted until 2026-03-14 09:14:24.932667885 +0000 UTC m=+1070.445360153 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs") pod "openstack-operator-controller-manager-59b5586c67-c9k5q" (UID: "6bff6422-f245-457e-9ddb-28f957c9edac") : secret "metrics-server-cert" not found Mar 14 09:14:16 crc kubenswrapper[4956]: E0314 09:14:16.932616 4956 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 09:14:16 crc kubenswrapper[4956]: E0314 09:14:16.932788 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs podName:6bff6422-f245-457e-9ddb-28f957c9edac nodeName:}" failed. No retries permitted until 2026-03-14 09:14:24.932755207 +0000 UTC m=+1070.445447475 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs") pod "openstack-operator-controller-manager-59b5586c67-c9k5q" (UID: "6bff6422-f245-457e-9ddb-28f957c9edac") : secret "webhook-server-cert" not found Mar 14 09:14:23 crc kubenswrapper[4956]: E0314 09:14:23.882204 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:b15bc78181df64e701e7dd6fd70f6c26c2cbb20c2a9e3b1180a635b791d586bf" Mar 14 09:14:23 crc kubenswrapper[4956]: E0314 09:14:23.883016 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:b15bc78181df64e701e7dd6fd70f6c26c2cbb20c2a9e3b1180a635b791d586bf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wrtz8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5b9475cdd7-586s5_openstack-operators(4492ee98-efe4-49c3-8c14-86453a8e8714): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 09:14:23 crc kubenswrapper[4956]: E0314 09:14:23.884196 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-5b9475cdd7-586s5" podUID="4492ee98-efe4-49c3-8c14-86453a8e8714" Mar 14 09:14:24 crc kubenswrapper[4956]: E0314 09:14:24.067647 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:b15bc78181df64e701e7dd6fd70f6c26c2cbb20c2a9e3b1180a635b791d586bf\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b9475cdd7-586s5" podUID="4492ee98-efe4-49c3-8c14-86453a8e8714" Mar 14 09:14:24 crc kubenswrapper[4956]: I0314 09:14:24.346855 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert\") pod \"infra-operator-controller-manager-fbfb5bd65-v7cqm\" (UID: \"e43b6f19-e463-43ce-9efe-5cefa3b53682\") " pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" Mar 14 09:14:24 crc kubenswrapper[4956]: E0314 09:14:24.347120 4956 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 09:14:24 crc kubenswrapper[4956]: E0314 09:14:24.347356 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert podName:e43b6f19-e463-43ce-9efe-5cefa3b53682 nodeName:}" failed. No retries permitted until 2026-03-14 09:14:40.34730264 +0000 UTC m=+1085.859994958 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert") pod "infra-operator-controller-manager-fbfb5bd65-v7cqm" (UID: "e43b6f19-e463-43ce-9efe-5cefa3b53682") : secret "infra-operator-webhook-server-cert" not found Mar 14 09:14:24 crc kubenswrapper[4956]: E0314 09:14:24.608240 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:374167613b6fe07cac0b627f6b5f1fa9aedc132504ef68ce66d11f1a9e7ac816" Mar 14 09:14:24 crc kubenswrapper[4956]: E0314 09:14:24.608442 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:374167613b6fe07cac0b627f6b5f1fa9aedc132504ef68ce66d11f1a9e7ac816,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hxwkm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-6d6bd468b-db2v8_openstack-operators(876c14aa-a86a-495f-a110-3ade7d8d69fb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 09:14:24 crc kubenswrapper[4956]: E0314 09:14:24.609718 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-6d6bd468b-db2v8" podUID="876c14aa-a86a-495f-a110-3ade7d8d69fb" Mar 14 09:14:24 crc kubenswrapper[4956]: I0314 09:14:24.653264 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph\" (UID: \"67182d33-3abc-4661-8614-94238efc9e45\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" Mar 14 09:14:24 crc kubenswrapper[4956]: I0314 09:14:24.662072 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/67182d33-3abc-4661-8614-94238efc9e45-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph\" (UID: \"67182d33-3abc-4661-8614-94238efc9e45\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" Mar 14 09:14:24 crc kubenswrapper[4956]: I0314 09:14:24.676203 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" Mar 14 09:14:24 crc kubenswrapper[4956]: I0314 09:14:24.956500 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:24 crc kubenswrapper[4956]: I0314 09:14:24.956568 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:24 crc kubenswrapper[4956]: E0314 09:14:24.956670 4956 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 09:14:24 crc kubenswrapper[4956]: E0314 09:14:24.956733 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs podName:6bff6422-f245-457e-9ddb-28f957c9edac nodeName:}" failed. No retries permitted until 2026-03-14 09:14:40.956717668 +0000 UTC m=+1086.469409936 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs") pod "openstack-operator-controller-manager-59b5586c67-c9k5q" (UID: "6bff6422-f245-457e-9ddb-28f957c9edac") : secret "webhook-server-cert" not found Mar 14 09:14:24 crc kubenswrapper[4956]: I0314 09:14:24.959937 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-metrics-certs\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:25 crc kubenswrapper[4956]: E0314 09:14:25.095911 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:374167613b6fe07cac0b627f6b5f1fa9aedc132504ef68ce66d11f1a9e7ac816\\\"\"" pod="openstack-operators/heat-operator-controller-manager-6d6bd468b-db2v8" podUID="876c14aa-a86a-495f-a110-3ade7d8d69fb" Mar 14 09:14:29 crc kubenswrapper[4956]: E0314 09:14:29.869313 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:a70ca136f44c6e6a2019ef73a813bdb97b2f7901a71f88591f3845750a554f88" Mar 14 09:14:29 crc kubenswrapper[4956]: E0314 09:14:29.870026 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:a70ca136f44c6e6a2019ef73a813bdb97b2f7901a71f88591f3845750a554f88,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m2x9r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-68f8d496f8-4knlv_openstack-operators(18ad86e0-d070-4de1-bd50-a93f9abdf715): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 09:14:29 crc kubenswrapper[4956]: E0314 09:14:29.871267 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-68f8d496f8-4knlv" podUID="18ad86e0-d070-4de1-bd50-a93f9abdf715" Mar 14 09:14:30 crc kubenswrapper[4956]: E0314 09:14:30.135850 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:a70ca136f44c6e6a2019ef73a813bdb97b2f7901a71f88591f3845750a554f88\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-68f8d496f8-4knlv" podUID="18ad86e0-d070-4de1-bd50-a93f9abdf715" Mar 14 09:14:30 crc kubenswrapper[4956]: E0314 09:14:30.785632 4956 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:ff20b84a172c2bdeaab0111915b0d1ba99370534ebd720d6daf63153a7d7d59e" Mar 14 09:14:30 crc kubenswrapper[4956]: E0314 09:14:30.786089 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:ff20b84a172c2bdeaab0111915b0d1ba99370534ebd720d6daf63153a7d7d59e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6d9ll,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-58ff56fcc7-9kjv4_openstack-operators(fdc31f77-84df-4657-b5ec-a7fcd8b673e2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 09:14:30 crc kubenswrapper[4956]: E0314 09:14:30.787312 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-58ff56fcc7-9kjv4" podUID="fdc31f77-84df-4657-b5ec-a7fcd8b673e2" Mar 14 09:14:31 crc kubenswrapper[4956]: E0314 09:14:31.143379 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:ff20b84a172c2bdeaab0111915b0d1ba99370534ebd720d6daf63153a7d7d59e\\\"\"" pod="openstack-operators/nova-operator-controller-manager-58ff56fcc7-9kjv4" podUID="fdc31f77-84df-4657-b5ec-a7fcd8b673e2" Mar 14 09:14:40 crc kubenswrapper[4956]: I0314 09:14:40.399445 4956 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert\") pod \"infra-operator-controller-manager-fbfb5bd65-v7cqm\" (UID: \"e43b6f19-e463-43ce-9efe-5cefa3b53682\") " pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" Mar 14 09:14:40 crc kubenswrapper[4956]: I0314 09:14:40.405157 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e43b6f19-e463-43ce-9efe-5cefa3b53682-cert\") pod \"infra-operator-controller-manager-fbfb5bd65-v7cqm\" (UID: \"e43b6f19-e463-43ce-9efe-5cefa3b53682\") " pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" Mar 14 09:14:40 crc kubenswrapper[4956]: I0314 09:14:40.573902 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-sp7ht" Mar 14 09:14:40 crc kubenswrapper[4956]: I0314 09:14:40.582748 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" Mar 14 09:14:41 crc kubenswrapper[4956]: I0314 09:14:41.007630 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:41 crc kubenswrapper[4956]: I0314 09:14:41.011099 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6bff6422-f245-457e-9ddb-28f957c9edac-webhook-certs\") pod \"openstack-operator-controller-manager-59b5586c67-c9k5q\" (UID: \"6bff6422-f245-457e-9ddb-28f957c9edac\") " pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:41 crc kubenswrapper[4956]: I0314 09:14:41.234026 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wwlv9" Mar 14 09:14:41 crc kubenswrapper[4956]: I0314 09:14:41.242580 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:41 crc kubenswrapper[4956]: E0314 09:14:41.482302 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27" Mar 14 09:14:41 crc kubenswrapper[4956]: E0314 09:14:41.482523 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8x2sw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-8467ccb4c8-r9dtz_openstack-operators(f1b524ae-60fa-48a7-aa07-bf354bd2ff62): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 09:14:41 crc kubenswrapper[4956]: E0314 09:14:41.483862 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz" podUID="f1b524ae-60fa-48a7-aa07-bf354bd2ff62" Mar 14 09:14:44 crc kubenswrapper[4956]: E0314 09:14:44.176851 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 14 09:14:44 crc kubenswrapper[4956]: E0314 09:14:44.177336 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jqtp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-vppb7_openstack-operators(3d61bf91-4992-47ad-8a53-e823a71d3f9c): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 09:14:44 crc kubenswrapper[4956]: E0314 09:14:44.178692 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vppb7" podUID="3d61bf91-4992-47ad-8a53-e823a71d3f9c" Mar 14 09:14:44 crc kubenswrapper[4956]: I0314 09:14:44.633682 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph"] Mar 14 09:14:44 crc kubenswrapper[4956]: E0314 09:14:44.720401 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.153:5001/openstack-k8s-operators/watcher-operator:3fad4a9eb56718f26ce2ec186bb570f2695f01c3" Mar 14 09:14:44 crc kubenswrapper[4956]: E0314 09:14:44.720451 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.153:5001/openstack-k8s-operators/watcher-operator:3fad4a9eb56718f26ce2ec186bb570f2695f01c3" Mar 14 09:14:44 crc kubenswrapper[4956]: E0314 09:14:44.720632 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.153:5001/openstack-k8s-operators/watcher-operator:3fad4a9eb56718f26ce2ec186bb570f2695f01c3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fwtw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-7cc8dbcb54-tckbd_openstack-operators(c5b17a07-5a25-4d01-80fb-bdbc8547cda7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 09:14:44 crc kubenswrapper[4956]: E0314 09:14:44.721786 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" podUID="c5b17a07-5a25-4d01-80fb-bdbc8547cda7" Mar 14 09:14:44 crc kubenswrapper[4956]: W0314 09:14:44.920978 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67182d33_3abc_4661_8614_94238efc9e45.slice/crio-648e909a3c588923257216dac6c7ccc8a9d29953649e34ff8c15cbf29e8ae9a1 WatchSource:0}: Error finding container 648e909a3c588923257216dac6c7ccc8a9d29953649e34ff8c15cbf29e8ae9a1: Status 404 returned error can't find the container with id 648e909a3c588923257216dac6c7ccc8a9d29953649e34ff8c15cbf29e8ae9a1 Mar 14 09:14:45 crc kubenswrapper[4956]: I0314 
09:14:45.277614 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9c8c85cd7-8xwbp" event={"ID":"6d1aed1b-6436-46ca-a824-59eafb8ca5d3","Type":"ContainerStarted","Data":"000ddc0de5e4a26b86643eb3dfe863b59aa74f30120c857b91380656fce438bc"} Mar 14 09:14:45 crc kubenswrapper[4956]: I0314 09:14:45.277870 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-9c8c85cd7-8xwbp" Mar 14 09:14:45 crc kubenswrapper[4956]: I0314 09:14:45.288293 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-74d565fbd5-c924b" event={"ID":"18021394-f27d-422e-a68c-24a19d74ceb8","Type":"ContainerStarted","Data":"0b30b499243dde33536bd21c9b98dbc47bdc76f2c3adadd2ba052cf039268b35"} Mar 14 09:14:45 crc kubenswrapper[4956]: I0314 09:14:45.288970 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-74d565fbd5-c924b" Mar 14 09:14:45 crc kubenswrapper[4956]: I0314 09:14:45.299121 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-9c8c85cd7-8xwbp" podStartSLOduration=20.413660269 podStartE2EDuration="37.299108519s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.021152864 +0000 UTC m=+1055.533845132" lastFinishedPulling="2026-03-14 09:14:26.906601114 +0000 UTC m=+1072.419293382" observedRunningTime="2026-03-14 09:14:45.297723454 +0000 UTC m=+1090.810415722" watchObservedRunningTime="2026-03-14 09:14:45.299108519 +0000 UTC m=+1090.811800787" Mar 14 09:14:45 crc kubenswrapper[4956]: I0314 09:14:45.305267 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-cb6d66846-b5jzw" 
event={"ID":"9bf45de7-ba46-4ce9-a7d7-fc26e253423b","Type":"ContainerStarted","Data":"8a412bce8badddee6ade78f775bdcd04ecfb39e547c103d35b7631fc2860d102"} Mar 14 09:14:45 crc kubenswrapper[4956]: I0314 09:14:45.305327 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-cb6d66846-b5jzw" Mar 14 09:14:45 crc kubenswrapper[4956]: I0314 09:14:45.311223 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" event={"ID":"67182d33-3abc-4661-8614-94238efc9e45","Type":"ContainerStarted","Data":"648e909a3c588923257216dac6c7ccc8a9d29953649e34ff8c15cbf29e8ae9a1"} Mar 14 09:14:45 crc kubenswrapper[4956]: I0314 09:14:45.328590 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-74d565fbd5-c924b" podStartSLOduration=20.851841848 podStartE2EDuration="37.328572052s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.429958612 +0000 UTC m=+1055.942650880" lastFinishedPulling="2026-03-14 09:14:26.906688816 +0000 UTC m=+1072.419381084" observedRunningTime="2026-03-14 09:14:45.316290826 +0000 UTC m=+1090.828983094" watchObservedRunningTime="2026-03-14 09:14:45.328572052 +0000 UTC m=+1090.841264320" Mar 14 09:14:45 crc kubenswrapper[4956]: I0314 09:14:45.346203 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q"] Mar 14 09:14:45 crc kubenswrapper[4956]: I0314 09:14:45.352867 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-cb6d66846-b5jzw" podStartSLOduration=22.784105315 podStartE2EDuration="37.352840105s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.024678812 +0000 UTC m=+1055.537371080" 
lastFinishedPulling="2026-03-14 09:14:24.593413602 +0000 UTC m=+1070.106105870" observedRunningTime="2026-03-14 09:14:45.335390691 +0000 UTC m=+1090.848082959" watchObservedRunningTime="2026-03-14 09:14:45.352840105 +0000 UTC m=+1090.865532373" Mar 14 09:14:45 crc kubenswrapper[4956]: W0314 09:14:45.353588 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bff6422_f245_457e_9ddb_28f957c9edac.slice/crio-7ecc764d4bab55cda7cf922f4a89793f3377312ae9188e0e69d331415be3baf5 WatchSource:0}: Error finding container 7ecc764d4bab55cda7cf922f4a89793f3377312ae9188e0e69d331415be3baf5: Status 404 returned error can't find the container with id 7ecc764d4bab55cda7cf922f4a89793f3377312ae9188e0e69d331415be3baf5 Mar 14 09:14:45 crc kubenswrapper[4956]: I0314 09:14:45.434387 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm"] Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.334642 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-848d74f969-6n97h" event={"ID":"a8d7db47-dd9b-4785-8e80-d7d97a324225","Type":"ContainerStarted","Data":"2cdbc52b761ac27f1ea05e679632d6e825cac9d96b35307189b2d7b2d431b03e"} Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.335513 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-848d74f969-6n97h" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.352684 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-68f8d496f8-4knlv" event={"ID":"18ad86e0-d070-4de1-bd50-a93f9abdf715","Type":"ContainerStarted","Data":"b01e382f2f94dcd306ebf947029eb0c873fd627966d77c386ad023202208fb40"} Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.353297 4956 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-68f8d496f8-4knlv" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.362761 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6646df7cdb-tspwx" event={"ID":"2c1f44c6-aae3-4c3c-933e-c87956fb0fe6","Type":"ContainerStarted","Data":"e32736e7a8251ba5fd4a1d0906e7eebd1f8d9880ee8b308c08c0088634f8ce42"} Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.363554 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6646df7cdb-tspwx" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.369166 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7f7469dbc6-6djhw" event={"ID":"24e7f369-d268-4aa7-89f9-1b4ce48fd197","Type":"ContainerStarted","Data":"00c2445edd3b53f6cceeebe245cc2683d40811890ac6551a69525d27975fdf4d"} Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.369935 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-7f7469dbc6-6djhw" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.382737 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64768694d-m9dwm" event={"ID":"b3749fc9-e22e-42b9-8865-68679f7d78f1","Type":"ContainerStarted","Data":"43c07c3777ff33b9f955595f86ca30f2136126046b5ade568401c6ba16349311"} Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.383512 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-64768694d-m9dwm" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.411570 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9475cdd7-586s5" 
event={"ID":"4492ee98-efe4-49c3-8c14-86453a8e8714","Type":"ContainerStarted","Data":"432cb79eceb0255e73a99165af1872bba9ec0fcca620e502760dd5d710672fba"} Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.412797 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9475cdd7-586s5" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.433426 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-848d74f969-6n97h" podStartSLOduration=4.400249497 podStartE2EDuration="38.433401561s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.90754315 +0000 UTC m=+1056.420235418" lastFinishedPulling="2026-03-14 09:14:44.940695194 +0000 UTC m=+1090.453387482" observedRunningTime="2026-03-14 09:14:46.380740841 +0000 UTC m=+1091.893433119" watchObservedRunningTime="2026-03-14 09:14:46.433401561 +0000 UTC m=+1091.946093829" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.433719 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-68f8d496f8-4knlv" podStartSLOduration=3.750224029 podStartE2EDuration="38.433713299s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.426128956 +0000 UTC m=+1055.938821224" lastFinishedPulling="2026-03-14 09:14:45.109618226 +0000 UTC m=+1090.622310494" observedRunningTime="2026-03-14 09:14:46.429869643 +0000 UTC m=+1091.942561911" watchObservedRunningTime="2026-03-14 09:14:46.433713299 +0000 UTC m=+1091.946405567" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.441957 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d6bd468b-db2v8" 
event={"ID":"876c14aa-a86a-495f-a110-3ade7d8d69fb","Type":"ContainerStarted","Data":"9f354abdfec316b6cf23505a8c7d682a044fffd83eb393d235c0324a12a136e0"} Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.442438 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-6d6bd468b-db2v8" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.461085 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" event={"ID":"e43b6f19-e463-43ce-9efe-5cefa3b53682","Type":"ContainerStarted","Data":"ce63e26aec0f625ec6d49def73cdfe69d78a7eb66d72e138d52b9a7586d9cb6f"} Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.465345 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6646df7cdb-tspwx" podStartSLOduration=18.105899641 podStartE2EDuration="38.465327335s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.805604355 +0000 UTC m=+1056.318296623" lastFinishedPulling="2026-03-14 09:14:31.165032049 +0000 UTC m=+1076.677724317" observedRunningTime="2026-03-14 09:14:46.463242573 +0000 UTC m=+1091.975934841" watchObservedRunningTime="2026-03-14 09:14:46.465327335 +0000 UTC m=+1091.978019593" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.493501 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6f6f57b9b6-9c7lm" event={"ID":"19ee7ede-7bda-46bc-8413-95262fa53969","Type":"ContainerStarted","Data":"e4d4cc0640c6583d0f0dfc455671b26f34cef9be9396902fbf06bb859ede451a"} Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.496369 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6f6f57b9b6-9c7lm" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.514402 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" event={"ID":"6bff6422-f245-457e-9ddb-28f957c9edac","Type":"ContainerStarted","Data":"4ce3cb3693710f385f9cde86ae53c5edc3d4f565492eb9c399c7f1aeb4a80102"} Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.514459 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" event={"ID":"6bff6422-f245-457e-9ddb-28f957c9edac","Type":"ContainerStarted","Data":"7ecc764d4bab55cda7cf922f4a89793f3377312ae9188e0e69d331415be3baf5"} Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.515562 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.553423 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-bf6b7fd8c-56k2s" event={"ID":"401a3c6d-db2a-435b-b7f5-08816736d895","Type":"ContainerStarted","Data":"9944310fe33b727b79448f2bb2225a3648b6afff560345b2f632ca4f5bd4f12e"} Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.554103 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-bf6b7fd8c-56k2s" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.589980 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-744456f686-nnfnk" event={"ID":"03bd3900-f5fa-476e-a91a-f492e4a424dc","Type":"ContainerStarted","Data":"d4c06c0a3b664875c2ca8a46b7d3c8c832634507c4a7b394f582b94b3bd175b1"} Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.591540 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-744456f686-nnfnk" Mar 14 09:14:46 crc 
kubenswrapper[4956]: I0314 09:14:46.611255 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-7f7469dbc6-6djhw" podStartSLOduration=5.34239387 podStartE2EDuration="38.611231544s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.9075515 +0000 UTC m=+1056.420243768" lastFinishedPulling="2026-03-14 09:14:44.176389174 +0000 UTC m=+1089.689081442" observedRunningTime="2026-03-14 09:14:46.602678551 +0000 UTC m=+1092.115370809" watchObservedRunningTime="2026-03-14 09:14:46.611231544 +0000 UTC m=+1092.123923812" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.623692 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-b5c469fd-bffxq" event={"ID":"bad25fda-3055-4fa2-8fd4-24980a88c7c6","Type":"ContainerStarted","Data":"141406cad2e9cdf0ea4435e72079b5b2f634f21c1fde9b3732d05d1c0e8d149c"} Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.624437 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-b5c469fd-bffxq" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.636662 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7cf9f49d6-qwdtd" event={"ID":"e0b8fc0f-4ffa-4c26-84e4-9613f5161286","Type":"ContainerStarted","Data":"725bad6ad180bb56ca03b08760bac2acad435552d4746a8f842e26fb9f1e320c"} Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.637774 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7cf9f49d6-qwdtd" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.679745 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-645c9f6488-9bn7h" 
event={"ID":"240f1b23-5499-4644-a442-9647e71a33d4","Type":"ContainerStarted","Data":"68f9f67034069a4a1bbf4b0ab47c04ba3cea412efc5a49c3b247fd9249cc1772"} Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.679789 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-645c9f6488-9bn7h" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.704312 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9475cdd7-586s5" podStartSLOduration=4.077314575 podStartE2EDuration="38.704295549s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.435572931 +0000 UTC m=+1055.948265199" lastFinishedPulling="2026-03-14 09:14:45.062553885 +0000 UTC m=+1090.575246173" observedRunningTime="2026-03-14 09:14:46.698148346 +0000 UTC m=+1092.210840614" watchObservedRunningTime="2026-03-14 09:14:46.704295549 +0000 UTC m=+1092.216987817" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.756542 4956 scope.go:117] "RemoveContainer" containerID="1ed45b2cc45b07140a5e8d2eb88b089bbaf5997b0fc8775aa61f71a8adaa2d1d" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.868671 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" podStartSLOduration=37.868644546 podStartE2EDuration="37.868644546s" podCreationTimestamp="2026-03-14 09:14:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:14:46.824871738 +0000 UTC m=+1092.337564006" watchObservedRunningTime="2026-03-14 09:14:46.868644546 +0000 UTC m=+1092.381336814" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.883598 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/mariadb-operator-controller-manager-744456f686-nnfnk" podStartSLOduration=9.165366955 podStartE2EDuration="38.883567208s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.840587235 +0000 UTC m=+1056.353279503" lastFinishedPulling="2026-03-14 09:14:40.558787498 +0000 UTC m=+1086.071479756" observedRunningTime="2026-03-14 09:14:46.864944004 +0000 UTC m=+1092.377636272" watchObservedRunningTime="2026-03-14 09:14:46.883567208 +0000 UTC m=+1092.396259476" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.894002 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6f6f57b9b6-9c7lm" podStartSLOduration=19.932184365 podStartE2EDuration="38.893985707s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.877532724 +0000 UTC m=+1056.390224992" lastFinishedPulling="2026-03-14 09:14:29.839334056 +0000 UTC m=+1075.352026334" observedRunningTime="2026-03-14 09:14:46.882111181 +0000 UTC m=+1092.394803449" watchObservedRunningTime="2026-03-14 09:14:46.893985707 +0000 UTC m=+1092.406677975" Mar 14 09:14:46 crc kubenswrapper[4956]: I0314 09:14:46.959720 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-64768694d-m9dwm" podStartSLOduration=18.613155167 podStartE2EDuration="38.959702241s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.81668771 +0000 UTC m=+1056.329379978" lastFinishedPulling="2026-03-14 09:14:31.163234784 +0000 UTC m=+1076.675927052" observedRunningTime="2026-03-14 09:14:46.954072821 +0000 UTC m=+1092.466765089" watchObservedRunningTime="2026-03-14 09:14:46.959702241 +0000 UTC m=+1092.472394509" Mar 14 09:14:47 crc kubenswrapper[4956]: I0314 09:14:47.012983 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/heat-operator-controller-manager-6d6bd468b-db2v8" podStartSLOduration=4.261339252 podStartE2EDuration="39.012953026s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.39010499 +0000 UTC m=+1055.902797258" lastFinishedPulling="2026-03-14 09:14:45.141718764 +0000 UTC m=+1090.654411032" observedRunningTime="2026-03-14 09:14:46.99704956 +0000 UTC m=+1092.509741828" watchObservedRunningTime="2026-03-14 09:14:47.012953026 +0000 UTC m=+1092.525645294" Mar 14 09:14:47 crc kubenswrapper[4956]: I0314 09:14:47.032366 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-bf6b7fd8c-56k2s" podStartSLOduration=18.985416076 podStartE2EDuration="39.032343458s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.487549214 +0000 UTC m=+1056.000241482" lastFinishedPulling="2026-03-14 09:14:30.534476596 +0000 UTC m=+1076.047168864" observedRunningTime="2026-03-14 09:14:47.022075683 +0000 UTC m=+1092.534767951" watchObservedRunningTime="2026-03-14 09:14:47.032343458 +0000 UTC m=+1092.545035726" Mar 14 09:14:47 crc kubenswrapper[4956]: I0314 09:14:47.100259 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-645c9f6488-9bn7h" podStartSLOduration=17.669210089 podStartE2EDuration="39.100230956s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.820212488 +0000 UTC m=+1056.332904756" lastFinishedPulling="2026-03-14 09:14:32.251233365 +0000 UTC m=+1077.763925623" observedRunningTime="2026-03-14 09:14:47.093260833 +0000 UTC m=+1092.605953111" watchObservedRunningTime="2026-03-14 09:14:47.100230956 +0000 UTC m=+1092.612923224" Mar 14 09:14:47 crc kubenswrapper[4956]: I0314 09:14:47.108636 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/placement-operator-controller-manager-b5c469fd-bffxq" podStartSLOduration=5.102841072 podStartE2EDuration="39.108605075s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.936305726 +0000 UTC m=+1056.448997994" lastFinishedPulling="2026-03-14 09:14:44.942069709 +0000 UTC m=+1090.454761997" observedRunningTime="2026-03-14 09:14:47.059876173 +0000 UTC m=+1092.572568451" watchObservedRunningTime="2026-03-14 09:14:47.108605075 +0000 UTC m=+1092.621297333" Mar 14 09:14:47 crc kubenswrapper[4956]: I0314 09:14:47.125830 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7cf9f49d6-qwdtd" podStartSLOduration=9.457711666 podStartE2EDuration="39.125810823s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.890715642 +0000 UTC m=+1056.403407910" lastFinishedPulling="2026-03-14 09:14:40.558814799 +0000 UTC m=+1086.071507067" observedRunningTime="2026-03-14 09:14:47.12571396 +0000 UTC m=+1092.638406228" watchObservedRunningTime="2026-03-14 09:14:47.125810823 +0000 UTC m=+1092.638503091" Mar 14 09:14:47 crc kubenswrapper[4956]: I0314 09:14:47.692624 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-58ff56fcc7-9kjv4" event={"ID":"fdc31f77-84df-4657-b5ec-a7fcd8b673e2","Type":"ContainerStarted","Data":"1ebacbac014c3353ebe2829cb4c9a1d9fcd8daa016d4c826436f30579227ee53"} Mar 14 09:14:47 crc kubenswrapper[4956]: I0314 09:14:47.693263 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-58ff56fcc7-9kjv4" Mar 14 09:14:51 crc kubenswrapper[4956]: I0314 09:14:51.251972 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-59b5586c67-c9k5q" Mar 14 09:14:51 crc kubenswrapper[4956]: 
I0314 09:14:51.281683 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-58ff56fcc7-9kjv4" podStartSLOduration=7.411624394 podStartE2EDuration="43.281661615s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.878712673 +0000 UTC m=+1056.391404941" lastFinishedPulling="2026-03-14 09:14:46.748749894 +0000 UTC m=+1092.261442162" observedRunningTime="2026-03-14 09:14:47.730353257 +0000 UTC m=+1093.243045525" watchObservedRunningTime="2026-03-14 09:14:51.281661615 +0000 UTC m=+1096.794353893" Mar 14 09:14:51 crc kubenswrapper[4956]: I0314 09:14:51.758052 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" event={"ID":"67182d33-3abc-4661-8614-94238efc9e45","Type":"ContainerStarted","Data":"f2b0ae2dd5a15afb2e4689e5cfab23dd6bc1f7589cb45cc519eef393184d146e"} Mar 14 09:14:51 crc kubenswrapper[4956]: I0314 09:14:51.758147 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" Mar 14 09:14:51 crc kubenswrapper[4956]: I0314 09:14:51.760518 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" event={"ID":"e43b6f19-e463-43ce-9efe-5cefa3b53682","Type":"ContainerStarted","Data":"1bfe5bf6df565b97103c6ef943fb2dab9b9f7d0195d4e08a6f85d45900fa8452"} Mar 14 09:14:51 crc kubenswrapper[4956]: I0314 09:14:51.798945 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" podStartSLOduration=37.718396187 podStartE2EDuration="43.79892672s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:44.927762163 +0000 UTC m=+1090.440454431" 
lastFinishedPulling="2026-03-14 09:14:51.008292706 +0000 UTC m=+1096.520984964" observedRunningTime="2026-03-14 09:14:51.791881705 +0000 UTC m=+1097.304573983" watchObservedRunningTime="2026-03-14 09:14:51.79892672 +0000 UTC m=+1097.311618988" Mar 14 09:14:51 crc kubenswrapper[4956]: I0314 09:14:51.814610 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" podStartSLOduration=38.258607834 podStartE2EDuration="43.81458887s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:45.452860033 +0000 UTC m=+1090.965552301" lastFinishedPulling="2026-03-14 09:14:51.008841069 +0000 UTC m=+1096.521533337" observedRunningTime="2026-03-14 09:14:51.810213101 +0000 UTC m=+1097.322905379" watchObservedRunningTime="2026-03-14 09:14:51.81458887 +0000 UTC m=+1097.327281138" Mar 14 09:14:52 crc kubenswrapper[4956]: I0314 09:14:52.767872 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" Mar 14 09:14:55 crc kubenswrapper[4956]: E0314 09:14:55.214588 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz" podUID="f1b524ae-60fa-48a7-aa07-bf354bd2ff62" Mar 14 09:14:57 crc kubenswrapper[4956]: E0314 09:14:57.210724 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vppb7" podUID="3d61bf91-4992-47ad-8a53-e823a71d3f9c" Mar 14 09:14:58 crc kubenswrapper[4956]: I0314 09:14:58.630149 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-cb6d66846-b5jzw" Mar 14 09:14:58 crc kubenswrapper[4956]: I0314 09:14:58.686364 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-9c8c85cd7-8xwbp" Mar 14 09:14:58 crc kubenswrapper[4956]: I0314 09:14:58.720997 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-74d565fbd5-c924b" Mar 14 09:14:58 crc kubenswrapper[4956]: I0314 09:14:58.731959 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-6d6bd468b-db2v8" Mar 14 09:14:58 crc kubenswrapper[4956]: I0314 09:14:58.745463 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9475cdd7-586s5" Mar 14 09:14:58 crc kubenswrapper[4956]: I0314 09:14:58.812374 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-bf6b7fd8c-56k2s" Mar 14 09:14:58 crc kubenswrapper[4956]: I0314 09:14:58.836773 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-68f8d496f8-4knlv" Mar 14 09:14:58 crc kubenswrapper[4956]: I0314 09:14:58.843167 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6f6f57b9b6-9c7lm" Mar 14 09:14:58 crc kubenswrapper[4956]: I0314 09:14:58.907509 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/barbican-operator-controller-manager-64768694d-m9dwm" Mar 14 09:14:58 crc kubenswrapper[4956]: I0314 09:14:58.951776 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-645c9f6488-9bn7h" Mar 14 09:14:58 crc kubenswrapper[4956]: I0314 09:14:58.978362 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-744456f686-nnfnk" Mar 14 09:14:58 crc kubenswrapper[4956]: I0314 09:14:58.995572 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-58ff56fcc7-9kjv4" Mar 14 09:14:59 crc kubenswrapper[4956]: I0314 09:14:59.036426 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7cf9f49d6-qwdtd" Mar 14 09:14:59 crc kubenswrapper[4956]: I0314 09:14:59.112605 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-848d74f969-6n97h" Mar 14 09:14:59 crc kubenswrapper[4956]: I0314 09:14:59.180358 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-7f7469dbc6-6djhw" Mar 14 09:14:59 crc kubenswrapper[4956]: E0314 09:14:59.211082 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.153:5001/openstack-k8s-operators/watcher-operator:3fad4a9eb56718f26ce2ec186bb570f2695f01c3\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" podUID="c5b17a07-5a25-4d01-80fb-bdbc8547cda7" Mar 14 09:14:59 crc kubenswrapper[4956]: I0314 09:14:59.248962 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/telemetry-operator-controller-manager-6646df7cdb-tspwx" Mar 14 09:14:59 crc kubenswrapper[4956]: I0314 09:14:59.432241 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-b5c469fd-bffxq" Mar 14 09:15:00 crc kubenswrapper[4956]: I0314 09:15:00.154570 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8"] Mar 14 09:15:00 crc kubenswrapper[4956]: I0314 09:15:00.155787 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8" Mar 14 09:15:00 crc kubenswrapper[4956]: I0314 09:15:00.157882 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 09:15:00 crc kubenswrapper[4956]: I0314 09:15:00.160827 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 09:15:00 crc kubenswrapper[4956]: I0314 09:15:00.167105 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8"] Mar 14 09:15:00 crc kubenswrapper[4956]: I0314 09:15:00.203531 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e218bcb6-6204-4e1f-96a8-958d7f6460d5-config-volume\") pod \"collect-profiles-29557995-d6pk8\" (UID: \"e218bcb6-6204-4e1f-96a8-958d7f6460d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8" Mar 14 09:15:00 crc kubenswrapper[4956]: I0314 09:15:00.203580 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e218bcb6-6204-4e1f-96a8-958d7f6460d5-secret-volume\") pod \"collect-profiles-29557995-d6pk8\" (UID: \"e218bcb6-6204-4e1f-96a8-958d7f6460d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8" Mar 14 09:15:00 crc kubenswrapper[4956]: I0314 09:15:00.203605 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9njn\" (UniqueName: \"kubernetes.io/projected/e218bcb6-6204-4e1f-96a8-958d7f6460d5-kube-api-access-z9njn\") pod \"collect-profiles-29557995-d6pk8\" (UID: \"e218bcb6-6204-4e1f-96a8-958d7f6460d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8" Mar 14 09:15:00 crc kubenswrapper[4956]: I0314 09:15:00.304697 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e218bcb6-6204-4e1f-96a8-958d7f6460d5-secret-volume\") pod \"collect-profiles-29557995-d6pk8\" (UID: \"e218bcb6-6204-4e1f-96a8-958d7f6460d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8" Mar 14 09:15:00 crc kubenswrapper[4956]: I0314 09:15:00.304754 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9njn\" (UniqueName: \"kubernetes.io/projected/e218bcb6-6204-4e1f-96a8-958d7f6460d5-kube-api-access-z9njn\") pod \"collect-profiles-29557995-d6pk8\" (UID: \"e218bcb6-6204-4e1f-96a8-958d7f6460d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8" Mar 14 09:15:00 crc kubenswrapper[4956]: I0314 09:15:00.304861 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e218bcb6-6204-4e1f-96a8-958d7f6460d5-config-volume\") pod \"collect-profiles-29557995-d6pk8\" (UID: \"e218bcb6-6204-4e1f-96a8-958d7f6460d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8" Mar 14 09:15:00 crc 
kubenswrapper[4956]: I0314 09:15:00.305721 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e218bcb6-6204-4e1f-96a8-958d7f6460d5-config-volume\") pod \"collect-profiles-29557995-d6pk8\" (UID: \"e218bcb6-6204-4e1f-96a8-958d7f6460d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8" Mar 14 09:15:00 crc kubenswrapper[4956]: I0314 09:15:00.311867 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e218bcb6-6204-4e1f-96a8-958d7f6460d5-secret-volume\") pod \"collect-profiles-29557995-d6pk8\" (UID: \"e218bcb6-6204-4e1f-96a8-958d7f6460d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8" Mar 14 09:15:00 crc kubenswrapper[4956]: I0314 09:15:00.321354 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9njn\" (UniqueName: \"kubernetes.io/projected/e218bcb6-6204-4e1f-96a8-958d7f6460d5-kube-api-access-z9njn\") pod \"collect-profiles-29557995-d6pk8\" (UID: \"e218bcb6-6204-4e1f-96a8-958d7f6460d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8" Mar 14 09:15:00 crc kubenswrapper[4956]: I0314 09:15:00.486579 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8" Mar 14 09:15:00 crc kubenswrapper[4956]: I0314 09:15:00.594531 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-fbfb5bd65-v7cqm" Mar 14 09:15:00 crc kubenswrapper[4956]: I0314 09:15:00.733794 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8"] Mar 14 09:15:00 crc kubenswrapper[4956]: W0314 09:15:00.736918 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode218bcb6_6204_4e1f_96a8_958d7f6460d5.slice/crio-bd103589c37708026e0b240fe891fa7a604b743b6686d44ff7ec3ee0487c74c5 WatchSource:0}: Error finding container bd103589c37708026e0b240fe891fa7a604b743b6686d44ff7ec3ee0487c74c5: Status 404 returned error can't find the container with id bd103589c37708026e0b240fe891fa7a604b743b6686d44ff7ec3ee0487c74c5 Mar 14 09:15:00 crc kubenswrapper[4956]: I0314 09:15:00.829805 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8" event={"ID":"e218bcb6-6204-4e1f-96a8-958d7f6460d5","Type":"ContainerStarted","Data":"bd103589c37708026e0b240fe891fa7a604b743b6686d44ff7ec3ee0487c74c5"} Mar 14 09:15:01 crc kubenswrapper[4956]: I0314 09:15:01.837317 4956 generic.go:334] "Generic (PLEG): container finished" podID="e218bcb6-6204-4e1f-96a8-958d7f6460d5" containerID="8ebe0b8076f779348c40b6e0de7de31bdc06fa03c1853f1e33105fa0ec8968e6" exitCode=0 Mar 14 09:15:01 crc kubenswrapper[4956]: I0314 09:15:01.837363 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8" event={"ID":"e218bcb6-6204-4e1f-96a8-958d7f6460d5","Type":"ContainerDied","Data":"8ebe0b8076f779348c40b6e0de7de31bdc06fa03c1853f1e33105fa0ec8968e6"} Mar 14 
09:15:03 crc kubenswrapper[4956]: I0314 09:15:03.118414 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8" Mar 14 09:15:03 crc kubenswrapper[4956]: I0314 09:15:03.257335 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e218bcb6-6204-4e1f-96a8-958d7f6460d5-config-volume\") pod \"e218bcb6-6204-4e1f-96a8-958d7f6460d5\" (UID: \"e218bcb6-6204-4e1f-96a8-958d7f6460d5\") " Mar 14 09:15:03 crc kubenswrapper[4956]: I0314 09:15:03.257378 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e218bcb6-6204-4e1f-96a8-958d7f6460d5-secret-volume\") pod \"e218bcb6-6204-4e1f-96a8-958d7f6460d5\" (UID: \"e218bcb6-6204-4e1f-96a8-958d7f6460d5\") " Mar 14 09:15:03 crc kubenswrapper[4956]: I0314 09:15:03.257494 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9njn\" (UniqueName: \"kubernetes.io/projected/e218bcb6-6204-4e1f-96a8-958d7f6460d5-kube-api-access-z9njn\") pod \"e218bcb6-6204-4e1f-96a8-958d7f6460d5\" (UID: \"e218bcb6-6204-4e1f-96a8-958d7f6460d5\") " Mar 14 09:15:03 crc kubenswrapper[4956]: I0314 09:15:03.259002 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e218bcb6-6204-4e1f-96a8-958d7f6460d5-config-volume" (OuterVolumeSpecName: "config-volume") pod "e218bcb6-6204-4e1f-96a8-958d7f6460d5" (UID: "e218bcb6-6204-4e1f-96a8-958d7f6460d5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:15:03 crc kubenswrapper[4956]: I0314 09:15:03.264285 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e218bcb6-6204-4e1f-96a8-958d7f6460d5-kube-api-access-z9njn" (OuterVolumeSpecName: "kube-api-access-z9njn") pod "e218bcb6-6204-4e1f-96a8-958d7f6460d5" (UID: "e218bcb6-6204-4e1f-96a8-958d7f6460d5"). InnerVolumeSpecName "kube-api-access-z9njn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:15:03 crc kubenswrapper[4956]: I0314 09:15:03.265128 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e218bcb6-6204-4e1f-96a8-958d7f6460d5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e218bcb6-6204-4e1f-96a8-958d7f6460d5" (UID: "e218bcb6-6204-4e1f-96a8-958d7f6460d5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:15:03 crc kubenswrapper[4956]: I0314 09:15:03.359420 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9njn\" (UniqueName: \"kubernetes.io/projected/e218bcb6-6204-4e1f-96a8-958d7f6460d5-kube-api-access-z9njn\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:03 crc kubenswrapper[4956]: I0314 09:15:03.359458 4956 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e218bcb6-6204-4e1f-96a8-958d7f6460d5-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:03 crc kubenswrapper[4956]: I0314 09:15:03.359469 4956 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e218bcb6-6204-4e1f-96a8-958d7f6460d5-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:03 crc kubenswrapper[4956]: I0314 09:15:03.853465 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8" 
event={"ID":"e218bcb6-6204-4e1f-96a8-958d7f6460d5","Type":"ContainerDied","Data":"bd103589c37708026e0b240fe891fa7a604b743b6686d44ff7ec3ee0487c74c5"} Mar 14 09:15:03 crc kubenswrapper[4956]: I0314 09:15:03.853514 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-d6pk8" Mar 14 09:15:03 crc kubenswrapper[4956]: I0314 09:15:03.853518 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd103589c37708026e0b240fe891fa7a604b743b6686d44ff7ec3ee0487c74c5" Mar 14 09:15:04 crc kubenswrapper[4956]: I0314 09:15:04.682042 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph" Mar 14 09:15:10 crc kubenswrapper[4956]: I0314 09:15:10.907219 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz" event={"ID":"f1b524ae-60fa-48a7-aa07-bf354bd2ff62","Type":"ContainerStarted","Data":"79383db8e1964fa39ef4bb91760cfb9ea9ee959b27863c09bb5f4cfb64138bcd"} Mar 14 09:15:10 crc kubenswrapper[4956]: I0314 09:15:10.908011 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz" Mar 14 09:15:10 crc kubenswrapper[4956]: I0314 09:15:10.922726 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz" podStartSLOduration=3.071630232 podStartE2EDuration="1m2.922703279s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.907574991 +0000 UTC m=+1056.420267259" lastFinishedPulling="2026-03-14 09:15:10.758648038 +0000 UTC m=+1116.271340306" observedRunningTime="2026-03-14 09:15:10.92237052 +0000 UTC m=+1116.435062788" watchObservedRunningTime="2026-03-14 09:15:10.922703279 
+0000 UTC m=+1116.435395557" Mar 14 09:15:11 crc kubenswrapper[4956]: I0314 09:15:11.915798 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" event={"ID":"c5b17a07-5a25-4d01-80fb-bdbc8547cda7","Type":"ContainerStarted","Data":"79f6552c004bf40b6025a9635119ac76d64d6a4e137b8f9cdd8e0b57166842ea"} Mar 14 09:15:11 crc kubenswrapper[4956]: I0314 09:15:11.917112 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" Mar 14 09:15:11 crc kubenswrapper[4956]: I0314 09:15:11.933846 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" podStartSLOduration=3.565048553 podStartE2EDuration="1m3.933828037s" podCreationTimestamp="2026-03-14 09:14:08 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.912578275 +0000 UTC m=+1056.425270543" lastFinishedPulling="2026-03-14 09:15:11.281357759 +0000 UTC m=+1116.794050027" observedRunningTime="2026-03-14 09:15:11.930212207 +0000 UTC m=+1117.442904475" watchObservedRunningTime="2026-03-14 09:15:11.933828037 +0000 UTC m=+1117.446520295" Mar 14 09:15:12 crc kubenswrapper[4956]: I0314 09:15:12.923215 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vppb7" event={"ID":"3d61bf91-4992-47ad-8a53-e823a71d3f9c","Type":"ContainerStarted","Data":"39571ca34aab802cae7ade8da29eaca31cfddb0cc9ab560b38c8753ce6bfde2c"} Mar 14 09:15:12 crc kubenswrapper[4956]: I0314 09:15:12.937956 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vppb7" podStartSLOduration=2.242951512 podStartE2EDuration="1m3.937935491s" podCreationTimestamp="2026-03-14 09:14:09 +0000 UTC" firstStartedPulling="2026-03-14 09:14:10.923716922 +0000 UTC 
m=+1056.436409190" lastFinishedPulling="2026-03-14 09:15:12.618700901 +0000 UTC m=+1118.131393169" observedRunningTime="2026-03-14 09:15:12.937534561 +0000 UTC m=+1118.450226839" watchObservedRunningTime="2026-03-14 09:15:12.937935491 +0000 UTC m=+1118.450627759" Mar 14 09:15:19 crc kubenswrapper[4956]: I0314 09:15:19.401845 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" Mar 14 09:15:19 crc kubenswrapper[4956]: I0314 09:15:19.402615 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r9dtz" Mar 14 09:15:24 crc kubenswrapper[4956]: I0314 09:15:24.076443 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd"] Mar 14 09:15:24 crc kubenswrapper[4956]: I0314 09:15:24.077025 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" podUID="c5b17a07-5a25-4d01-80fb-bdbc8547cda7" containerName="manager" containerID="cri-o://79f6552c004bf40b6025a9635119ac76d64d6a4e137b8f9cdd8e0b57166842ea" gracePeriod=10 Mar 14 09:15:24 crc kubenswrapper[4956]: I0314 09:15:24.487917 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" Mar 14 09:15:24 crc kubenswrapper[4956]: I0314 09:15:24.577418 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q"] Mar 14 09:15:24 crc kubenswrapper[4956]: I0314 09:15:24.577705 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q" podUID="abca7d4b-14a7-431e-8b05-66a118ab327e" containerName="operator" containerID="cri-o://dde289f7a97902fecdf9760c7c08ccfd1189b9cdb07bd91908b112c614ad56e6" gracePeriod=10 Mar 14 09:15:24 crc kubenswrapper[4956]: I0314 09:15:24.655651 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwtw8\" (UniqueName: \"kubernetes.io/projected/c5b17a07-5a25-4d01-80fb-bdbc8547cda7-kube-api-access-fwtw8\") pod \"c5b17a07-5a25-4d01-80fb-bdbc8547cda7\" (UID: \"c5b17a07-5a25-4d01-80fb-bdbc8547cda7\") " Mar 14 09:15:24 crc kubenswrapper[4956]: I0314 09:15:24.671504 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5b17a07-5a25-4d01-80fb-bdbc8547cda7-kube-api-access-fwtw8" (OuterVolumeSpecName: "kube-api-access-fwtw8") pod "c5b17a07-5a25-4d01-80fb-bdbc8547cda7" (UID: "c5b17a07-5a25-4d01-80fb-bdbc8547cda7"). InnerVolumeSpecName "kube-api-access-fwtw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:15:24 crc kubenswrapper[4956]: I0314 09:15:24.757614 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwtw8\" (UniqueName: \"kubernetes.io/projected/c5b17a07-5a25-4d01-80fb-bdbc8547cda7-kube-api-access-fwtw8\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:24 crc kubenswrapper[4956]: I0314 09:15:24.977442 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q" Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.016781 4956 generic.go:334] "Generic (PLEG): container finished" podID="c5b17a07-5a25-4d01-80fb-bdbc8547cda7" containerID="79f6552c004bf40b6025a9635119ac76d64d6a4e137b8f9cdd8e0b57166842ea" exitCode=0 Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.016838 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" event={"ID":"c5b17a07-5a25-4d01-80fb-bdbc8547cda7","Type":"ContainerDied","Data":"79f6552c004bf40b6025a9635119ac76d64d6a4e137b8f9cdd8e0b57166842ea"} Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.016870 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" event={"ID":"c5b17a07-5a25-4d01-80fb-bdbc8547cda7","Type":"ContainerDied","Data":"caae42fb64ee35e863ba7204f959bc9549152726a1d6bc1f6f079dd4c6aa2e71"} Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.016885 4956 scope.go:117] "RemoveContainer" containerID="79f6552c004bf40b6025a9635119ac76d64d6a4e137b8f9cdd8e0b57166842ea" Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.017007 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd" Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.023853 4956 generic.go:334] "Generic (PLEG): container finished" podID="abca7d4b-14a7-431e-8b05-66a118ab327e" containerID="dde289f7a97902fecdf9760c7c08ccfd1189b9cdb07bd91908b112c614ad56e6" exitCode=0 Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.023900 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q" event={"ID":"abca7d4b-14a7-431e-8b05-66a118ab327e","Type":"ContainerDied","Data":"dde289f7a97902fecdf9760c7c08ccfd1189b9cdb07bd91908b112c614ad56e6"} Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.023933 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q" event={"ID":"abca7d4b-14a7-431e-8b05-66a118ab327e","Type":"ContainerDied","Data":"a726edc093df3219d78619d68d8889556fe20f42f4b52a497efd0ef37bb39de7"} Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.023996 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q" Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.056475 4956 scope.go:117] "RemoveContainer" containerID="79f6552c004bf40b6025a9635119ac76d64d6a4e137b8f9cdd8e0b57166842ea" Mar 14 09:15:25 crc kubenswrapper[4956]: E0314 09:15:25.057367 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79f6552c004bf40b6025a9635119ac76d64d6a4e137b8f9cdd8e0b57166842ea\": container with ID starting with 79f6552c004bf40b6025a9635119ac76d64d6a4e137b8f9cdd8e0b57166842ea not found: ID does not exist" containerID="79f6552c004bf40b6025a9635119ac76d64d6a4e137b8f9cdd8e0b57166842ea" Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.057399 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79f6552c004bf40b6025a9635119ac76d64d6a4e137b8f9cdd8e0b57166842ea"} err="failed to get container status \"79f6552c004bf40b6025a9635119ac76d64d6a4e137b8f9cdd8e0b57166842ea\": rpc error: code = NotFound desc = could not find container \"79f6552c004bf40b6025a9635119ac76d64d6a4e137b8f9cdd8e0b57166842ea\": container with ID starting with 79f6552c004bf40b6025a9635119ac76d64d6a4e137b8f9cdd8e0b57166842ea not found: ID does not exist" Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.057419 4956 scope.go:117] "RemoveContainer" containerID="dde289f7a97902fecdf9760c7c08ccfd1189b9cdb07bd91908b112c614ad56e6" Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.060387 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd"] Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.070550 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvw59\" (UniqueName: \"kubernetes.io/projected/abca7d4b-14a7-431e-8b05-66a118ab327e-kube-api-access-dvw59\") pod 
\"abca7d4b-14a7-431e-8b05-66a118ab327e\" (UID: \"abca7d4b-14a7-431e-8b05-66a118ab327e\") " Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.078330 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abca7d4b-14a7-431e-8b05-66a118ab327e-kube-api-access-dvw59" (OuterVolumeSpecName: "kube-api-access-dvw59") pod "abca7d4b-14a7-431e-8b05-66a118ab327e" (UID: "abca7d4b-14a7-431e-8b05-66a118ab327e"). InnerVolumeSpecName "kube-api-access-dvw59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.086417 4956 scope.go:117] "RemoveContainer" containerID="dde289f7a97902fecdf9760c7c08ccfd1189b9cdb07bd91908b112c614ad56e6" Mar 14 09:15:25 crc kubenswrapper[4956]: E0314 09:15:25.086977 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde289f7a97902fecdf9760c7c08ccfd1189b9cdb07bd91908b112c614ad56e6\": container with ID starting with dde289f7a97902fecdf9760c7c08ccfd1189b9cdb07bd91908b112c614ad56e6 not found: ID does not exist" containerID="dde289f7a97902fecdf9760c7c08ccfd1189b9cdb07bd91908b112c614ad56e6" Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.087014 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde289f7a97902fecdf9760c7c08ccfd1189b9cdb07bd91908b112c614ad56e6"} err="failed to get container status \"dde289f7a97902fecdf9760c7c08ccfd1189b9cdb07bd91908b112c614ad56e6\": rpc error: code = NotFound desc = could not find container \"dde289f7a97902fecdf9760c7c08ccfd1189b9cdb07bd91908b112c614ad56e6\": container with ID starting with dde289f7a97902fecdf9760c7c08ccfd1189b9cdb07bd91908b112c614ad56e6 not found: ID does not exist" Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.094361 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7cc8dbcb54-tckbd"] Mar 14 
09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.175370 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvw59\" (UniqueName: \"kubernetes.io/projected/abca7d4b-14a7-431e-8b05-66a118ab327e-kube-api-access-dvw59\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.219896 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5b17a07-5a25-4d01-80fb-bdbc8547cda7" path="/var/lib/kubelet/pods/c5b17a07-5a25-4d01-80fb-bdbc8547cda7/volumes" Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.345566 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q"] Mar 14 09:15:25 crc kubenswrapper[4956]: I0314 09:15:25.350733 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6ccbf6d758-tgm5q"] Mar 14 09:15:27 crc kubenswrapper[4956]: I0314 09:15:27.217733 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abca7d4b-14a7-431e-8b05-66a118ab327e" path="/var/lib/kubelet/pods/abca7d4b-14a7-431e-8b05-66a118ab327e/volumes" Mar 14 09:15:28 crc kubenswrapper[4956]: I0314 09:15:28.590824 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-index-jl5p2"] Mar 14 09:15:28 crc kubenswrapper[4956]: E0314 09:15:28.591448 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e218bcb6-6204-4e1f-96a8-958d7f6460d5" containerName="collect-profiles" Mar 14 09:15:28 crc kubenswrapper[4956]: I0314 09:15:28.591464 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e218bcb6-6204-4e1f-96a8-958d7f6460d5" containerName="collect-profiles" Mar 14 09:15:28 crc kubenswrapper[4956]: E0314 09:15:28.591509 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b17a07-5a25-4d01-80fb-bdbc8547cda7" containerName="manager" Mar 14 09:15:28 crc kubenswrapper[4956]: I0314 09:15:28.591519 
4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b17a07-5a25-4d01-80fb-bdbc8547cda7" containerName="manager" Mar 14 09:15:28 crc kubenswrapper[4956]: E0314 09:15:28.591535 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abca7d4b-14a7-431e-8b05-66a118ab327e" containerName="operator" Mar 14 09:15:28 crc kubenswrapper[4956]: I0314 09:15:28.591543 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="abca7d4b-14a7-431e-8b05-66a118ab327e" containerName="operator" Mar 14 09:15:28 crc kubenswrapper[4956]: I0314 09:15:28.591713 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="abca7d4b-14a7-431e-8b05-66a118ab327e" containerName="operator" Mar 14 09:15:28 crc kubenswrapper[4956]: I0314 09:15:28.591744 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e218bcb6-6204-4e1f-96a8-958d7f6460d5" containerName="collect-profiles" Mar 14 09:15:28 crc kubenswrapper[4956]: I0314 09:15:28.591755 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b17a07-5a25-4d01-80fb-bdbc8547cda7" containerName="manager" Mar 14 09:15:28 crc kubenswrapper[4956]: I0314 09:15:28.592336 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-index-jl5p2" Mar 14 09:15:28 crc kubenswrapper[4956]: I0314 09:15:28.594573 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-index-dockercfg-6xhrj" Mar 14 09:15:28 crc kubenswrapper[4956]: I0314 09:15:28.601275 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-index-jl5p2"] Mar 14 09:15:28 crc kubenswrapper[4956]: I0314 09:15:28.724872 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdq9q\" (UniqueName: \"kubernetes.io/projected/75e6f1a9-bb4c-496d-841c-849bf8a375d7-kube-api-access-cdq9q\") pod \"watcher-operator-index-jl5p2\" (UID: \"75e6f1a9-bb4c-496d-841c-849bf8a375d7\") " pod="openstack-operators/watcher-operator-index-jl5p2" Mar 14 09:15:28 crc kubenswrapper[4956]: I0314 09:15:28.825895 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdq9q\" (UniqueName: \"kubernetes.io/projected/75e6f1a9-bb4c-496d-841c-849bf8a375d7-kube-api-access-cdq9q\") pod \"watcher-operator-index-jl5p2\" (UID: \"75e6f1a9-bb4c-496d-841c-849bf8a375d7\") " pod="openstack-operators/watcher-operator-index-jl5p2" Mar 14 09:15:28 crc kubenswrapper[4956]: I0314 09:15:28.916543 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdq9q\" (UniqueName: \"kubernetes.io/projected/75e6f1a9-bb4c-496d-841c-849bf8a375d7-kube-api-access-cdq9q\") pod \"watcher-operator-index-jl5p2\" (UID: \"75e6f1a9-bb4c-496d-841c-849bf8a375d7\") " pod="openstack-operators/watcher-operator-index-jl5p2" Mar 14 09:15:29 crc kubenswrapper[4956]: I0314 09:15:29.208693 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-index-jl5p2" Mar 14 09:15:29 crc kubenswrapper[4956]: W0314 09:15:29.762136 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75e6f1a9_bb4c_496d_841c_849bf8a375d7.slice/crio-b34f2e8d7774968ea0d964aa80b98bf8e2df0b1528715df152bd7b42d3aa6e68 WatchSource:0}: Error finding container b34f2e8d7774968ea0d964aa80b98bf8e2df0b1528715df152bd7b42d3aa6e68: Status 404 returned error can't find the container with id b34f2e8d7774968ea0d964aa80b98bf8e2df0b1528715df152bd7b42d3aa6e68 Mar 14 09:15:29 crc kubenswrapper[4956]: I0314 09:15:29.769862 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-index-jl5p2"] Mar 14 09:15:30 crc kubenswrapper[4956]: I0314 09:15:30.063254 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-jl5p2" event={"ID":"75e6f1a9-bb4c-496d-841c-849bf8a375d7","Type":"ContainerStarted","Data":"b34f2e8d7774968ea0d964aa80b98bf8e2df0b1528715df152bd7b42d3aa6e68"} Mar 14 09:15:31 crc kubenswrapper[4956]: I0314 09:15:31.072675 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-jl5p2" event={"ID":"75e6f1a9-bb4c-496d-841c-849bf8a375d7","Type":"ContainerStarted","Data":"d051a265d4f535e3183b5bb796ca078031859cf323d53273a2834e369acee899"} Mar 14 09:15:31 crc kubenswrapper[4956]: I0314 09:15:31.100896 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-index-jl5p2" podStartSLOduration=2.939048041 podStartE2EDuration="3.100866756s" podCreationTimestamp="2026-03-14 09:15:28 +0000 UTC" firstStartedPulling="2026-03-14 09:15:29.765038232 +0000 UTC m=+1135.277730500" lastFinishedPulling="2026-03-14 09:15:29.926856957 +0000 UTC m=+1135.439549215" observedRunningTime="2026-03-14 09:15:31.091649346 +0000 UTC m=+1136.604341614" 
watchObservedRunningTime="2026-03-14 09:15:31.100866756 +0000 UTC m=+1136.613559064" Mar 14 09:15:39 crc kubenswrapper[4956]: I0314 09:15:39.225900 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-index-jl5p2" Mar 14 09:15:39 crc kubenswrapper[4956]: I0314 09:15:39.226373 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/watcher-operator-index-jl5p2" Mar 14 09:15:39 crc kubenswrapper[4956]: I0314 09:15:39.240936 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/watcher-operator-index-jl5p2" Mar 14 09:15:40 crc kubenswrapper[4956]: I0314 09:15:40.233714 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-index-jl5p2" Mar 14 09:15:47 crc kubenswrapper[4956]: I0314 09:15:47.834074 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr"] Mar 14 09:15:47 crc kubenswrapper[4956]: I0314 09:15:47.836183 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" Mar 14 09:15:47 crc kubenswrapper[4956]: I0314 09:15:47.843667 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr"] Mar 14 09:15:47 crc kubenswrapper[4956]: I0314 09:15:47.843948 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-nndhm" Mar 14 09:15:47 crc kubenswrapper[4956]: I0314 09:15:47.898667 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07597b31-3cd5-4c5c-8bb1-65366038ddbb-bundle\") pod \"632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr\" (UID: \"07597b31-3cd5-4c5c-8bb1-65366038ddbb\") " pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" Mar 14 09:15:47 crc kubenswrapper[4956]: I0314 09:15:47.898768 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07597b31-3cd5-4c5c-8bb1-65366038ddbb-util\") pod \"632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr\" (UID: \"07597b31-3cd5-4c5c-8bb1-65366038ddbb\") " pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" Mar 14 09:15:47 crc kubenswrapper[4956]: I0314 09:15:47.898797 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh5cd\" (UniqueName: \"kubernetes.io/projected/07597b31-3cd5-4c5c-8bb1-65366038ddbb-kube-api-access-jh5cd\") pod \"632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr\" (UID: \"07597b31-3cd5-4c5c-8bb1-65366038ddbb\") " pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" Mar 14 09:15:48 crc kubenswrapper[4956]: I0314 
09:15:48.000446 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07597b31-3cd5-4c5c-8bb1-65366038ddbb-bundle\") pod \"632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr\" (UID: \"07597b31-3cd5-4c5c-8bb1-65366038ddbb\") " pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" Mar 14 09:15:48 crc kubenswrapper[4956]: I0314 09:15:48.000583 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07597b31-3cd5-4c5c-8bb1-65366038ddbb-util\") pod \"632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr\" (UID: \"07597b31-3cd5-4c5c-8bb1-65366038ddbb\") " pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" Mar 14 09:15:48 crc kubenswrapper[4956]: I0314 09:15:48.000620 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh5cd\" (UniqueName: \"kubernetes.io/projected/07597b31-3cd5-4c5c-8bb1-65366038ddbb-kube-api-access-jh5cd\") pod \"632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr\" (UID: \"07597b31-3cd5-4c5c-8bb1-65366038ddbb\") " pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" Mar 14 09:15:48 crc kubenswrapper[4956]: I0314 09:15:48.001450 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07597b31-3cd5-4c5c-8bb1-65366038ddbb-bundle\") pod \"632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr\" (UID: \"07597b31-3cd5-4c5c-8bb1-65366038ddbb\") " pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" Mar 14 09:15:48 crc kubenswrapper[4956]: I0314 09:15:48.001510 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/07597b31-3cd5-4c5c-8bb1-65366038ddbb-util\") pod \"632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr\" (UID: \"07597b31-3cd5-4c5c-8bb1-65366038ddbb\") " pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" Mar 14 09:15:48 crc kubenswrapper[4956]: I0314 09:15:48.024401 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh5cd\" (UniqueName: \"kubernetes.io/projected/07597b31-3cd5-4c5c-8bb1-65366038ddbb-kube-api-access-jh5cd\") pod \"632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr\" (UID: \"07597b31-3cd5-4c5c-8bb1-65366038ddbb\") " pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" Mar 14 09:15:48 crc kubenswrapper[4956]: I0314 09:15:48.161836 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" Mar 14 09:15:48 crc kubenswrapper[4956]: I0314 09:15:48.699753 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr"] Mar 14 09:15:49 crc kubenswrapper[4956]: I0314 09:15:49.200037 4956 generic.go:334] "Generic (PLEG): container finished" podID="07597b31-3cd5-4c5c-8bb1-65366038ddbb" containerID="ecff92e25421293e2d8ddf77ad1df8b2b8fb55e9764ff559b1978a3b7f579c24" exitCode=0 Mar 14 09:15:49 crc kubenswrapper[4956]: I0314 09:15:49.200084 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" event={"ID":"07597b31-3cd5-4c5c-8bb1-65366038ddbb","Type":"ContainerDied","Data":"ecff92e25421293e2d8ddf77ad1df8b2b8fb55e9764ff559b1978a3b7f579c24"} Mar 14 09:15:49 crc kubenswrapper[4956]: I0314 09:15:49.200115 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" event={"ID":"07597b31-3cd5-4c5c-8bb1-65366038ddbb","Type":"ContainerStarted","Data":"e0297d1dbbfbf0b9fca01cb0aca5d28207dac3e69cb5184a811aa4e464f35586"} Mar 14 09:15:50 crc kubenswrapper[4956]: I0314 09:15:50.213228 4956 generic.go:334] "Generic (PLEG): container finished" podID="07597b31-3cd5-4c5c-8bb1-65366038ddbb" containerID="545ef798f31cce12d4fe2c337948fb3743de81503708c32f26089580a16a17ba" exitCode=0 Mar 14 09:15:50 crc kubenswrapper[4956]: I0314 09:15:50.213323 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" event={"ID":"07597b31-3cd5-4c5c-8bb1-65366038ddbb","Type":"ContainerDied","Data":"545ef798f31cce12d4fe2c337948fb3743de81503708c32f26089580a16a17ba"} Mar 14 09:15:51 crc kubenswrapper[4956]: I0314 09:15:51.224551 4956 generic.go:334] "Generic (PLEG): container finished" podID="07597b31-3cd5-4c5c-8bb1-65366038ddbb" containerID="966b12c8c05efa733e51841cf577e17bf3c72782117828548c19c48c2f7da481" exitCode=0 Mar 14 09:15:51 crc kubenswrapper[4956]: I0314 09:15:51.224643 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" event={"ID":"07597b31-3cd5-4c5c-8bb1-65366038ddbb","Type":"ContainerDied","Data":"966b12c8c05efa733e51841cf577e17bf3c72782117828548c19c48c2f7da481"} Mar 14 09:15:52 crc kubenswrapper[4956]: I0314 09:15:52.530962 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" Mar 14 09:15:52 crc kubenswrapper[4956]: I0314 09:15:52.665015 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07597b31-3cd5-4c5c-8bb1-65366038ddbb-bundle\") pod \"07597b31-3cd5-4c5c-8bb1-65366038ddbb\" (UID: \"07597b31-3cd5-4c5c-8bb1-65366038ddbb\") " Mar 14 09:15:52 crc kubenswrapper[4956]: I0314 09:15:52.665074 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh5cd\" (UniqueName: \"kubernetes.io/projected/07597b31-3cd5-4c5c-8bb1-65366038ddbb-kube-api-access-jh5cd\") pod \"07597b31-3cd5-4c5c-8bb1-65366038ddbb\" (UID: \"07597b31-3cd5-4c5c-8bb1-65366038ddbb\") " Mar 14 09:15:52 crc kubenswrapper[4956]: I0314 09:15:52.665107 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07597b31-3cd5-4c5c-8bb1-65366038ddbb-util\") pod \"07597b31-3cd5-4c5c-8bb1-65366038ddbb\" (UID: \"07597b31-3cd5-4c5c-8bb1-65366038ddbb\") " Mar 14 09:15:52 crc kubenswrapper[4956]: I0314 09:15:52.666027 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07597b31-3cd5-4c5c-8bb1-65366038ddbb-bundle" (OuterVolumeSpecName: "bundle") pod "07597b31-3cd5-4c5c-8bb1-65366038ddbb" (UID: "07597b31-3cd5-4c5c-8bb1-65366038ddbb"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:15:52 crc kubenswrapper[4956]: I0314 09:15:52.668047 4956 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07597b31-3cd5-4c5c-8bb1-65366038ddbb-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:52 crc kubenswrapper[4956]: I0314 09:15:52.670457 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07597b31-3cd5-4c5c-8bb1-65366038ddbb-kube-api-access-jh5cd" (OuterVolumeSpecName: "kube-api-access-jh5cd") pod "07597b31-3cd5-4c5c-8bb1-65366038ddbb" (UID: "07597b31-3cd5-4c5c-8bb1-65366038ddbb"). InnerVolumeSpecName "kube-api-access-jh5cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:15:52 crc kubenswrapper[4956]: I0314 09:15:52.677952 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07597b31-3cd5-4c5c-8bb1-65366038ddbb-util" (OuterVolumeSpecName: "util") pod "07597b31-3cd5-4c5c-8bb1-65366038ddbb" (UID: "07597b31-3cd5-4c5c-8bb1-65366038ddbb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:15:52 crc kubenswrapper[4956]: I0314 09:15:52.769088 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh5cd\" (UniqueName: \"kubernetes.io/projected/07597b31-3cd5-4c5c-8bb1-65366038ddbb-kube-api-access-jh5cd\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:52 crc kubenswrapper[4956]: I0314 09:15:52.769131 4956 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07597b31-3cd5-4c5c-8bb1-65366038ddbb-util\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:53 crc kubenswrapper[4956]: I0314 09:15:53.244269 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" event={"ID":"07597b31-3cd5-4c5c-8bb1-65366038ddbb","Type":"ContainerDied","Data":"e0297d1dbbfbf0b9fca01cb0aca5d28207dac3e69cb5184a811aa4e464f35586"} Mar 14 09:15:53 crc kubenswrapper[4956]: I0314 09:15:53.244310 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0297d1dbbfbf0b9fca01cb0aca5d28207dac3e69cb5184a811aa4e464f35586" Mar 14 09:15:53 crc kubenswrapper[4956]: I0314 09:15:53.244338 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr" Mar 14 09:15:55 crc kubenswrapper[4956]: I0314 09:15:55.424295 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:15:55 crc kubenswrapper[4956]: I0314 09:15:55.424644 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.385416 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn"] Mar 14 09:15:59 crc kubenswrapper[4956]: E0314 09:15:59.386010 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07597b31-3cd5-4c5c-8bb1-65366038ddbb" containerName="util" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.386022 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="07597b31-3cd5-4c5c-8bb1-65366038ddbb" containerName="util" Mar 14 09:15:59 crc kubenswrapper[4956]: E0314 09:15:59.386034 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07597b31-3cd5-4c5c-8bb1-65366038ddbb" containerName="pull" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.386039 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="07597b31-3cd5-4c5c-8bb1-65366038ddbb" containerName="pull" Mar 14 09:15:59 crc kubenswrapper[4956]: E0314 09:15:59.386049 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07597b31-3cd5-4c5c-8bb1-65366038ddbb" 
containerName="extract" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.386055 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="07597b31-3cd5-4c5c-8bb1-65366038ddbb" containerName="extract" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.386198 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="07597b31-3cd5-4c5c-8bb1-65366038ddbb" containerName="extract" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.386685 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.389924 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-service-cert" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.390101 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-w27pd" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.397825 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn"] Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.458874 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fa26bd7c-f56d-4c2e-917f-082b11b312d8-apiservice-cert\") pod \"watcher-operator-controller-manager-84884ff888-gvpxn\" (UID: \"fa26bd7c-f56d-4c2e-917f-082b11b312d8\") " pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.459174 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fa26bd7c-f56d-4c2e-917f-082b11b312d8-webhook-cert\") pod 
\"watcher-operator-controller-manager-84884ff888-gvpxn\" (UID: \"fa26bd7c-f56d-4c2e-917f-082b11b312d8\") " pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.459209 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfzms\" (UniqueName: \"kubernetes.io/projected/fa26bd7c-f56d-4c2e-917f-082b11b312d8-kube-api-access-tfzms\") pod \"watcher-operator-controller-manager-84884ff888-gvpxn\" (UID: \"fa26bd7c-f56d-4c2e-917f-082b11b312d8\") " pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.561349 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fa26bd7c-f56d-4c2e-917f-082b11b312d8-apiservice-cert\") pod \"watcher-operator-controller-manager-84884ff888-gvpxn\" (UID: \"fa26bd7c-f56d-4c2e-917f-082b11b312d8\") " pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.561450 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fa26bd7c-f56d-4c2e-917f-082b11b312d8-webhook-cert\") pod \"watcher-operator-controller-manager-84884ff888-gvpxn\" (UID: \"fa26bd7c-f56d-4c2e-917f-082b11b312d8\") " pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.561588 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfzms\" (UniqueName: \"kubernetes.io/projected/fa26bd7c-f56d-4c2e-917f-082b11b312d8-kube-api-access-tfzms\") pod \"watcher-operator-controller-manager-84884ff888-gvpxn\" (UID: \"fa26bd7c-f56d-4c2e-917f-082b11b312d8\") " 
pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.567071 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fa26bd7c-f56d-4c2e-917f-082b11b312d8-apiservice-cert\") pod \"watcher-operator-controller-manager-84884ff888-gvpxn\" (UID: \"fa26bd7c-f56d-4c2e-917f-082b11b312d8\") " pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.567559 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fa26bd7c-f56d-4c2e-917f-082b11b312d8-webhook-cert\") pod \"watcher-operator-controller-manager-84884ff888-gvpxn\" (UID: \"fa26bd7c-f56d-4c2e-917f-082b11b312d8\") " pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.577881 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfzms\" (UniqueName: \"kubernetes.io/projected/fa26bd7c-f56d-4c2e-917f-082b11b312d8-kube-api-access-tfzms\") pod \"watcher-operator-controller-manager-84884ff888-gvpxn\" (UID: \"fa26bd7c-f56d-4c2e-917f-082b11b312d8\") " pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" Mar 14 09:15:59 crc kubenswrapper[4956]: I0314 09:15:59.706281 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" Mar 14 09:16:00 crc kubenswrapper[4956]: I0314 09:16:00.140916 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn"] Mar 14 09:16:00 crc kubenswrapper[4956]: I0314 09:16:00.148911 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557996-wjwrm"] Mar 14 09:16:00 crc kubenswrapper[4956]: I0314 09:16:00.150081 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557996-wjwrm" Mar 14 09:16:00 crc kubenswrapper[4956]: I0314 09:16:00.155657 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:16:00 crc kubenswrapper[4956]: I0314 09:16:00.155857 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:16:00 crc kubenswrapper[4956]: I0314 09:16:00.156110 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:16:00 crc kubenswrapper[4956]: I0314 09:16:00.161323 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557996-wjwrm"] Mar 14 09:16:00 crc kubenswrapper[4956]: I0314 09:16:00.277254 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxlg5\" (UniqueName: \"kubernetes.io/projected/f3305c86-b3b7-4e20-8b6c-8cb514addf0e-kube-api-access-kxlg5\") pod \"auto-csr-approver-29557996-wjwrm\" (UID: \"f3305c86-b3b7-4e20-8b6c-8cb514addf0e\") " pod="openshift-infra/auto-csr-approver-29557996-wjwrm" Mar 14 09:16:00 crc kubenswrapper[4956]: I0314 09:16:00.295332 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" event={"ID":"fa26bd7c-f56d-4c2e-917f-082b11b312d8","Type":"ContainerStarted","Data":"7957ca36e68b703436800ee278318b72677b4116cace07f33b828179acd857d1"} Mar 14 09:16:00 crc kubenswrapper[4956]: I0314 09:16:00.378216 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxlg5\" (UniqueName: \"kubernetes.io/projected/f3305c86-b3b7-4e20-8b6c-8cb514addf0e-kube-api-access-kxlg5\") pod \"auto-csr-approver-29557996-wjwrm\" (UID: \"f3305c86-b3b7-4e20-8b6c-8cb514addf0e\") " pod="openshift-infra/auto-csr-approver-29557996-wjwrm" Mar 14 09:16:00 crc kubenswrapper[4956]: I0314 09:16:00.396006 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxlg5\" (UniqueName: \"kubernetes.io/projected/f3305c86-b3b7-4e20-8b6c-8cb514addf0e-kube-api-access-kxlg5\") pod \"auto-csr-approver-29557996-wjwrm\" (UID: \"f3305c86-b3b7-4e20-8b6c-8cb514addf0e\") " pod="openshift-infra/auto-csr-approver-29557996-wjwrm" Mar 14 09:16:00 crc kubenswrapper[4956]: I0314 09:16:00.493903 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557996-wjwrm" Mar 14 09:16:00 crc kubenswrapper[4956]: I0314 09:16:00.899556 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557996-wjwrm"] Mar 14 09:16:01 crc kubenswrapper[4956]: I0314 09:16:01.303199 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557996-wjwrm" event={"ID":"f3305c86-b3b7-4e20-8b6c-8cb514addf0e","Type":"ContainerStarted","Data":"eb361fd42b253abf5443c472e92c2a15ab6ab1c2ab52913b59997b530200d0b1"} Mar 14 09:16:01 crc kubenswrapper[4956]: I0314 09:16:01.305140 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" event={"ID":"fa26bd7c-f56d-4c2e-917f-082b11b312d8","Type":"ContainerStarted","Data":"8b92fe7ae09730582cc46f95420c1218eb81be8cb58edfdeaa4412cb290b5d1c"} Mar 14 09:16:01 crc kubenswrapper[4956]: I0314 09:16:01.305503 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" Mar 14 09:16:01 crc kubenswrapper[4956]: I0314 09:16:01.324810 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" podStartSLOduration=2.324793434 podStartE2EDuration="2.324793434s" podCreationTimestamp="2026-03-14 09:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:16:01.319305098 +0000 UTC m=+1166.831997366" watchObservedRunningTime="2026-03-14 09:16:01.324793434 +0000 UTC m=+1166.837485702" Mar 14 09:16:02 crc kubenswrapper[4956]: I0314 09:16:02.323704 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557996-wjwrm" 
event={"ID":"f3305c86-b3b7-4e20-8b6c-8cb514addf0e","Type":"ContainerStarted","Data":"f1269729d86b8766b2385e57e3d5b58b97745a83bc93d518d4e50c55ad709f1a"} Mar 14 09:16:02 crc kubenswrapper[4956]: I0314 09:16:02.340079 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557996-wjwrm" podStartSLOduration=1.292844337 podStartE2EDuration="2.340064337s" podCreationTimestamp="2026-03-14 09:16:00 +0000 UTC" firstStartedPulling="2026-03-14 09:16:00.908786598 +0000 UTC m=+1166.421478866" lastFinishedPulling="2026-03-14 09:16:01.956006598 +0000 UTC m=+1167.468698866" observedRunningTime="2026-03-14 09:16:02.339507535 +0000 UTC m=+1167.852199803" watchObservedRunningTime="2026-03-14 09:16:02.340064337 +0000 UTC m=+1167.852756605" Mar 14 09:16:03 crc kubenswrapper[4956]: I0314 09:16:03.332353 4956 generic.go:334] "Generic (PLEG): container finished" podID="f3305c86-b3b7-4e20-8b6c-8cb514addf0e" containerID="f1269729d86b8766b2385e57e3d5b58b97745a83bc93d518d4e50c55ad709f1a" exitCode=0 Mar 14 09:16:03 crc kubenswrapper[4956]: I0314 09:16:03.332405 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557996-wjwrm" event={"ID":"f3305c86-b3b7-4e20-8b6c-8cb514addf0e","Type":"ContainerDied","Data":"f1269729d86b8766b2385e57e3d5b58b97745a83bc93d518d4e50c55ad709f1a"} Mar 14 09:16:04 crc kubenswrapper[4956]: I0314 09:16:04.603572 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557996-wjwrm" Mar 14 09:16:04 crc kubenswrapper[4956]: I0314 09:16:04.743114 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxlg5\" (UniqueName: \"kubernetes.io/projected/f3305c86-b3b7-4e20-8b6c-8cb514addf0e-kube-api-access-kxlg5\") pod \"f3305c86-b3b7-4e20-8b6c-8cb514addf0e\" (UID: \"f3305c86-b3b7-4e20-8b6c-8cb514addf0e\") " Mar 14 09:16:04 crc kubenswrapper[4956]: I0314 09:16:04.749005 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3305c86-b3b7-4e20-8b6c-8cb514addf0e-kube-api-access-kxlg5" (OuterVolumeSpecName: "kube-api-access-kxlg5") pod "f3305c86-b3b7-4e20-8b6c-8cb514addf0e" (UID: "f3305c86-b3b7-4e20-8b6c-8cb514addf0e"). InnerVolumeSpecName "kube-api-access-kxlg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:16:04 crc kubenswrapper[4956]: I0314 09:16:04.845057 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxlg5\" (UniqueName: \"kubernetes.io/projected/f3305c86-b3b7-4e20-8b6c-8cb514addf0e-kube-api-access-kxlg5\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:05 crc kubenswrapper[4956]: I0314 09:16:05.345365 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557996-wjwrm" event={"ID":"f3305c86-b3b7-4e20-8b6c-8cb514addf0e","Type":"ContainerDied","Data":"eb361fd42b253abf5443c472e92c2a15ab6ab1c2ab52913b59997b530200d0b1"} Mar 14 09:16:05 crc kubenswrapper[4956]: I0314 09:16:05.345634 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb361fd42b253abf5443c472e92c2a15ab6ab1c2ab52913b59997b530200d0b1" Mar 14 09:16:05 crc kubenswrapper[4956]: I0314 09:16:05.345416 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557996-wjwrm" Mar 14 09:16:05 crc kubenswrapper[4956]: I0314 09:16:05.398523 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557990-s9vd6"] Mar 14 09:16:05 crc kubenswrapper[4956]: I0314 09:16:05.404049 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557990-s9vd6"] Mar 14 09:16:07 crc kubenswrapper[4956]: I0314 09:16:07.234013 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2" path="/var/lib/kubelet/pods/3aa10a3d-74ac-4959-b2fc-64b0dc5c66d2/volumes" Mar 14 09:16:09 crc kubenswrapper[4956]: I0314 09:16:09.710749 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" Mar 14 09:16:10 crc kubenswrapper[4956]: I0314 09:16:10.738302 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd"] Mar 14 09:16:10 crc kubenswrapper[4956]: E0314 09:16:10.738962 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3305c86-b3b7-4e20-8b6c-8cb514addf0e" containerName="oc" Mar 14 09:16:10 crc kubenswrapper[4956]: I0314 09:16:10.738978 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3305c86-b3b7-4e20-8b6c-8cb514addf0e" containerName="oc" Mar 14 09:16:10 crc kubenswrapper[4956]: I0314 09:16:10.739168 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3305c86-b3b7-4e20-8b6c-8cb514addf0e" containerName="oc" Mar 14 09:16:10 crc kubenswrapper[4956]: I0314 09:16:10.739774 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd" Mar 14 09:16:10 crc kubenswrapper[4956]: I0314 09:16:10.755315 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd"] Mar 14 09:16:10 crc kubenswrapper[4956]: I0314 09:16:10.835391 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn82p\" (UniqueName: \"kubernetes.io/projected/75ec3cea-39db-4cc8-8065-17259b7dd1e4-kube-api-access-sn82p\") pod \"watcher-operator-controller-manager-788cc4b948-xgdnd\" (UID: \"75ec3cea-39db-4cc8-8065-17259b7dd1e4\") " pod="openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd" Mar 14 09:16:10 crc kubenswrapper[4956]: I0314 09:16:10.835444 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/75ec3cea-39db-4cc8-8065-17259b7dd1e4-webhook-cert\") pod \"watcher-operator-controller-manager-788cc4b948-xgdnd\" (UID: \"75ec3cea-39db-4cc8-8065-17259b7dd1e4\") " pod="openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd" Mar 14 09:16:10 crc kubenswrapper[4956]: I0314 09:16:10.835495 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/75ec3cea-39db-4cc8-8065-17259b7dd1e4-apiservice-cert\") pod \"watcher-operator-controller-manager-788cc4b948-xgdnd\" (UID: \"75ec3cea-39db-4cc8-8065-17259b7dd1e4\") " pod="openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd" Mar 14 09:16:10 crc kubenswrapper[4956]: I0314 09:16:10.936825 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn82p\" (UniqueName: \"kubernetes.io/projected/75ec3cea-39db-4cc8-8065-17259b7dd1e4-kube-api-access-sn82p\") pod 
\"watcher-operator-controller-manager-788cc4b948-xgdnd\" (UID: \"75ec3cea-39db-4cc8-8065-17259b7dd1e4\") " pod="openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd" Mar 14 09:16:10 crc kubenswrapper[4956]: I0314 09:16:10.936888 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/75ec3cea-39db-4cc8-8065-17259b7dd1e4-webhook-cert\") pod \"watcher-operator-controller-manager-788cc4b948-xgdnd\" (UID: \"75ec3cea-39db-4cc8-8065-17259b7dd1e4\") " pod="openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd" Mar 14 09:16:10 crc kubenswrapper[4956]: I0314 09:16:10.936920 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/75ec3cea-39db-4cc8-8065-17259b7dd1e4-apiservice-cert\") pod \"watcher-operator-controller-manager-788cc4b948-xgdnd\" (UID: \"75ec3cea-39db-4cc8-8065-17259b7dd1e4\") " pod="openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd" Mar 14 09:16:10 crc kubenswrapper[4956]: I0314 09:16:10.942367 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/75ec3cea-39db-4cc8-8065-17259b7dd1e4-apiservice-cert\") pod \"watcher-operator-controller-manager-788cc4b948-xgdnd\" (UID: \"75ec3cea-39db-4cc8-8065-17259b7dd1e4\") " pod="openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd" Mar 14 09:16:10 crc kubenswrapper[4956]: I0314 09:16:10.942388 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/75ec3cea-39db-4cc8-8065-17259b7dd1e4-webhook-cert\") pod \"watcher-operator-controller-manager-788cc4b948-xgdnd\" (UID: \"75ec3cea-39db-4cc8-8065-17259b7dd1e4\") " pod="openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd" Mar 14 09:16:10 crc kubenswrapper[4956]: I0314 
09:16:10.953775 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn82p\" (UniqueName: \"kubernetes.io/projected/75ec3cea-39db-4cc8-8065-17259b7dd1e4-kube-api-access-sn82p\") pod \"watcher-operator-controller-manager-788cc4b948-xgdnd\" (UID: \"75ec3cea-39db-4cc8-8065-17259b7dd1e4\") " pod="openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd" Mar 14 09:16:11 crc kubenswrapper[4956]: I0314 09:16:11.061735 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd" Mar 14 09:16:11 crc kubenswrapper[4956]: I0314 09:16:11.485785 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd"] Mar 14 09:16:12 crc kubenswrapper[4956]: I0314 09:16:12.403906 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd" event={"ID":"75ec3cea-39db-4cc8-8065-17259b7dd1e4","Type":"ContainerStarted","Data":"892c1d494f63f415ba6a1a48380e4611fbe2aab0686b1cf23406b3655e34d529"} Mar 14 09:16:12 crc kubenswrapper[4956]: I0314 09:16:12.404274 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd" event={"ID":"75ec3cea-39db-4cc8-8065-17259b7dd1e4","Type":"ContainerStarted","Data":"6ffe74ec039178135ce0f9bcc9d51f7b21bc7e9acf99b9cefd280bf4b2e1cee5"} Mar 14 09:16:12 crc kubenswrapper[4956]: I0314 09:16:12.404691 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd" Mar 14 09:16:21 crc kubenswrapper[4956]: I0314 09:16:21.068055 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd" Mar 14 09:16:21 crc kubenswrapper[4956]: I0314 
09:16:21.089618 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-788cc4b948-xgdnd" podStartSLOduration=11.089598061 podStartE2EDuration="11.089598061s" podCreationTimestamp="2026-03-14 09:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:16:12.428525708 +0000 UTC m=+1177.941217976" watchObservedRunningTime="2026-03-14 09:16:21.089598061 +0000 UTC m=+1186.602290339" Mar 14 09:16:21 crc kubenswrapper[4956]: I0314 09:16:21.118964 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn"] Mar 14 09:16:21 crc kubenswrapper[4956]: I0314 09:16:21.119212 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" podUID="fa26bd7c-f56d-4c2e-917f-082b11b312d8" containerName="manager" containerID="cri-o://8b92fe7ae09730582cc46f95420c1218eb81be8cb58edfdeaa4412cb290b5d1c" gracePeriod=10 Mar 14 09:16:21 crc kubenswrapper[4956]: I0314 09:16:21.467258 4956 generic.go:334] "Generic (PLEG): container finished" podID="fa26bd7c-f56d-4c2e-917f-082b11b312d8" containerID="8b92fe7ae09730582cc46f95420c1218eb81be8cb58edfdeaa4412cb290b5d1c" exitCode=0 Mar 14 09:16:21 crc kubenswrapper[4956]: I0314 09:16:21.467322 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" event={"ID":"fa26bd7c-f56d-4c2e-917f-082b11b312d8","Type":"ContainerDied","Data":"8b92fe7ae09730582cc46f95420c1218eb81be8cb58edfdeaa4412cb290b5d1c"} Mar 14 09:16:21 crc kubenswrapper[4956]: I0314 09:16:21.536832 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" Mar 14 09:16:21 crc kubenswrapper[4956]: I0314 09:16:21.582503 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfzms\" (UniqueName: \"kubernetes.io/projected/fa26bd7c-f56d-4c2e-917f-082b11b312d8-kube-api-access-tfzms\") pod \"fa26bd7c-f56d-4c2e-917f-082b11b312d8\" (UID: \"fa26bd7c-f56d-4c2e-917f-082b11b312d8\") " Mar 14 09:16:21 crc kubenswrapper[4956]: I0314 09:16:21.582585 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fa26bd7c-f56d-4c2e-917f-082b11b312d8-apiservice-cert\") pod \"fa26bd7c-f56d-4c2e-917f-082b11b312d8\" (UID: \"fa26bd7c-f56d-4c2e-917f-082b11b312d8\") " Mar 14 09:16:21 crc kubenswrapper[4956]: I0314 09:16:21.582661 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fa26bd7c-f56d-4c2e-917f-082b11b312d8-webhook-cert\") pod \"fa26bd7c-f56d-4c2e-917f-082b11b312d8\" (UID: \"fa26bd7c-f56d-4c2e-917f-082b11b312d8\") " Mar 14 09:16:21 crc kubenswrapper[4956]: I0314 09:16:21.589001 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa26bd7c-f56d-4c2e-917f-082b11b312d8-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "fa26bd7c-f56d-4c2e-917f-082b11b312d8" (UID: "fa26bd7c-f56d-4c2e-917f-082b11b312d8"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:16:21 crc kubenswrapper[4956]: I0314 09:16:21.589005 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa26bd7c-f56d-4c2e-917f-082b11b312d8-kube-api-access-tfzms" (OuterVolumeSpecName: "kube-api-access-tfzms") pod "fa26bd7c-f56d-4c2e-917f-082b11b312d8" (UID: "fa26bd7c-f56d-4c2e-917f-082b11b312d8"). 
InnerVolumeSpecName "kube-api-access-tfzms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:16:21 crc kubenswrapper[4956]: I0314 09:16:21.590908 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa26bd7c-f56d-4c2e-917f-082b11b312d8-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "fa26bd7c-f56d-4c2e-917f-082b11b312d8" (UID: "fa26bd7c-f56d-4c2e-917f-082b11b312d8"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:16:21 crc kubenswrapper[4956]: I0314 09:16:21.684040 4956 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fa26bd7c-f56d-4c2e-917f-082b11b312d8-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:21 crc kubenswrapper[4956]: I0314 09:16:21.684070 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfzms\" (UniqueName: \"kubernetes.io/projected/fa26bd7c-f56d-4c2e-917f-082b11b312d8-kube-api-access-tfzms\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:21 crc kubenswrapper[4956]: I0314 09:16:21.684080 4956 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fa26bd7c-f56d-4c2e-917f-082b11b312d8-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:22 crc kubenswrapper[4956]: I0314 09:16:22.477605 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" event={"ID":"fa26bd7c-f56d-4c2e-917f-082b11b312d8","Type":"ContainerDied","Data":"7957ca36e68b703436800ee278318b72677b4116cace07f33b828179acd857d1"} Mar 14 09:16:22 crc kubenswrapper[4956]: I0314 09:16:22.477997 4956 scope.go:117] "RemoveContainer" containerID="8b92fe7ae09730582cc46f95420c1218eb81be8cb58edfdeaa4412cb290b5d1c" Mar 14 09:16:22 crc kubenswrapper[4956]: I0314 09:16:22.477695 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn" Mar 14 09:16:22 crc kubenswrapper[4956]: I0314 09:16:22.513086 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn"] Mar 14 09:16:22 crc kubenswrapper[4956]: I0314 09:16:22.518325 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-84884ff888-gvpxn"] Mar 14 09:16:23 crc kubenswrapper[4956]: I0314 09:16:23.223664 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa26bd7c-f56d-4c2e-917f-082b11b312d8" path="/var/lib/kubelet/pods/fa26bd7c-f56d-4c2e-917f-082b11b312d8/volumes" Mar 14 09:16:25 crc kubenswrapper[4956]: I0314 09:16:25.423370 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:16:25 crc kubenswrapper[4956]: I0314 09:16:25.423717 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.340523 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"] Mar 14 09:16:33 crc kubenswrapper[4956]: E0314 09:16:33.341945 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa26bd7c-f56d-4c2e-917f-082b11b312d8" containerName="manager" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.341975 4956 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fa26bd7c-f56d-4c2e-917f-082b11b312d8" containerName="manager" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.342153 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa26bd7c-f56d-4c2e-917f-082b11b312d8" containerName="manager" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.343055 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/rabbitmq-server-0" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.345197 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-erlang-cookie" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.345203 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-default-user" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.347456 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-server-conf" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.347702 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-config-data" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.347959 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openshift-service-ca.crt" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.348022 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-rabbitmq-svc" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.348148 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"kube-root-ca.crt" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.348164 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-server-dockercfg-4x9pf" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.348238 4956 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-plugins-conf" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.355336 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"] Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.449473 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/28cef006-3a3a-464f-b6d0-9faea75b0a9e-config-data\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.449555 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/28cef006-3a3a-464f-b6d0-9faea75b0a9e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.449600 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/28cef006-3a3a-464f-b6d0-9faea75b0a9e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.449644 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/28cef006-3a3a-464f-b6d0-9faea75b0a9e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.449781 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/28cef006-3a3a-464f-b6d0-9faea75b0a9e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.449835 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/28cef006-3a3a-464f-b6d0-9faea75b0a9e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.450013 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn6pt\" (UniqueName: \"kubernetes.io/projected/28cef006-3a3a-464f-b6d0-9faea75b0a9e-kube-api-access-gn6pt\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.450045 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/28cef006-3a3a-464f-b6d0-9faea75b0a9e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.450196 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7322bf0d-47be-43b6-8f54-88e3b7c42c65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7322bf0d-47be-43b6-8f54-88e3b7c42c65\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0" Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.450228 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/28cef006-3a3a-464f-b6d0-9faea75b0a9e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.450255 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/28cef006-3a3a-464f-b6d0-9faea75b0a9e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.551190 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn6pt\" (UniqueName: \"kubernetes.io/projected/28cef006-3a3a-464f-b6d0-9faea75b0a9e-kube-api-access-gn6pt\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.551236 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/28cef006-3a3a-464f-b6d0-9faea75b0a9e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.551268 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7322bf0d-47be-43b6-8f54-88e3b7c42c65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7322bf0d-47be-43b6-8f54-88e3b7c42c65\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.551292 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/28cef006-3a3a-464f-b6d0-9faea75b0a9e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.551313 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/28cef006-3a3a-464f-b6d0-9faea75b0a9e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.551341 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/28cef006-3a3a-464f-b6d0-9faea75b0a9e-config-data\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.551371 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/28cef006-3a3a-464f-b6d0-9faea75b0a9e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.551397 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/28cef006-3a3a-464f-b6d0-9faea75b0a9e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.551430 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/28cef006-3a3a-464f-b6d0-9faea75b0a9e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.551453 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/28cef006-3a3a-464f-b6d0-9faea75b0a9e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.551470 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/28cef006-3a3a-464f-b6d0-9faea75b0a9e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.552039 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/28cef006-3a3a-464f-b6d0-9faea75b0a9e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.552162 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/28cef006-3a3a-464f-b6d0-9faea75b0a9e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.552452 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/28cef006-3a3a-464f-b6d0-9faea75b0a9e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.552785 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/28cef006-3a3a-464f-b6d0-9faea75b0a9e-config-data\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.553207 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/28cef006-3a3a-464f-b6d0-9faea75b0a9e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.555060 4956 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.555089 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7322bf0d-47be-43b6-8f54-88e3b7c42c65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7322bf0d-47be-43b6-8f54-88e3b7c42c65\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5bb0c427ff6b750fda1d08ae4365deff4fe94edb13108dcde6210b2f62f53695/globalmount\"" pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.557871 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/28cef006-3a3a-464f-b6d0-9faea75b0a9e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.557896 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/28cef006-3a3a-464f-b6d0-9faea75b0a9e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.557934 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/28cef006-3a3a-464f-b6d0-9faea75b0a9e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.559327 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/28cef006-3a3a-464f-b6d0-9faea75b0a9e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.571327 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn6pt\" (UniqueName: \"kubernetes.io/projected/28cef006-3a3a-464f-b6d0-9faea75b0a9e-kube-api-access-gn6pt\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.587739 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7322bf0d-47be-43b6-8f54-88e3b7c42c65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7322bf0d-47be-43b6-8f54-88e3b7c42c65\") pod \"rabbitmq-server-0\" (UID: \"28cef006-3a3a-464f-b6d0-9faea75b0a9e\") " pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.664079 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/rabbitmq-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.906368 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"]
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.907804 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.912069 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-config-data"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.912416 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-plugins-conf"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.912649 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-default-user"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.913529 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-rabbitmq-notifications-svc"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.913643 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-server-conf"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.913737 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-erlang-cookie"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.913816 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-server-dockercfg-xrw4c"
Mar 14 09:16:33 crc kubenswrapper[4956]: I0314 09:16:33.920139 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"]
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.058974 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xnmm\" (UniqueName: \"kubernetes.io/projected/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-kube-api-access-6xnmm\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.059026 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.059056 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.059099 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.059304 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.059398 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7e5ae742-6b09-4c93-a84b-29aed38dc971\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e5ae742-6b09-4c93-a84b-29aed38dc971\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.059423 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.059445 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.059580 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.059628 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.059651 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.088016 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"]
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.097939 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.161319 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.161372 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.161396 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.161431 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7e5ae742-6b09-4c93-a84b-29aed38dc971\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e5ae742-6b09-4c93-a84b-29aed38dc971\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.161453 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.161469 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.161524 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.161543 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.161558 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.161588 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xnmm\" (UniqueName: \"kubernetes.io/projected/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-kube-api-access-6xnmm\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.161609 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.162560 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.162902 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.163059 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.163401 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.164051 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.166153 4956 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.166366 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7e5ae742-6b09-4c93-a84b-29aed38dc971\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e5ae742-6b09-4c93-a84b-29aed38dc971\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/28afb892c87bb5d2fe560124a71d7152f66c9ddb883cb191ed698c1c08ead010/globalmount\"" pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.172202 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.172367 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.172503 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.172845 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.186352 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xnmm\" (UniqueName: \"kubernetes.io/projected/ca474341-5bf4-4fa5-bc46-56d42c3ccffd-kube-api-access-6xnmm\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.212733 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7e5ae742-6b09-4c93-a84b-29aed38dc971\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e5ae742-6b09-4c93-a84b-29aed38dc971\") pod \"rabbitmq-notifications-server-0\" (UID: \"ca474341-5bf4-4fa5-bc46-56d42c3ccffd\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.242138 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/rabbitmq-notifications-server-0"
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.572300 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"28cef006-3a3a-464f-b6d0-9faea75b0a9e","Type":"ContainerStarted","Data":"444ade92bc6eb6df63fca9d6dc5157c71dbed2fc4ef54a0403f3b90d55b67cb7"}
Mar 14 09:16:34 crc kubenswrapper[4956]: I0314 09:16:34.703769 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"]
Mar 14 09:16:34 crc kubenswrapper[4956]: W0314 09:16:34.737874 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca474341_5bf4_4fa5_bc46_56d42c3ccffd.slice/crio-394d88a3bfb03d34bff0ff2960df309d7428fdddd0a674cf70cd7371b14c129d WatchSource:0}: Error finding container 394d88a3bfb03d34bff0ff2960df309d7428fdddd0a674cf70cd7371b14c129d: Status 404 returned error can't find the container with id 394d88a3bfb03d34bff0ff2960df309d7428fdddd0a674cf70cd7371b14c129d
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.260870 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/openstack-galera-0"]
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.262892 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.276149 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"galera-openstack-dockercfg-88jb7"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.276300 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-config-data"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.276545 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-galera-openstack-svc"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.278503 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-scripts"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.285780 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"combined-ca-bundle"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.309467 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstack-galera-0"]
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.381965 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.382062 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.382098 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-kolla-config\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.382131 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-83eb712c-62fd-407b-b439-cd79e20b41fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83eb712c-62fd-407b-b439-cd79e20b41fa\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.382154 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.382171 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jpj2\" (UniqueName: \"kubernetes.io/projected/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-kube-api-access-9jpj2\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.382231 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.382248 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-config-data-default\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.489985 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.490064 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.490110 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-kolla-config\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.490158 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-83eb712c-62fd-407b-b439-cd79e20b41fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83eb712c-62fd-407b-b439-cd79e20b41fa\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.490187 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.490207 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jpj2\" (UniqueName: \"kubernetes.io/projected/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-kube-api-access-9jpj2\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.490258 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.490293 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-config-data-default\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.490602 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.491095 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-kolla-config\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.498150 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.499885 4956 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.499947 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-83eb712c-62fd-407b-b439-cd79e20b41fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83eb712c-62fd-407b-b439-cd79e20b41fa\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8ea7a38931fa33ec7a49459cb79a8125cbe04de135d5cca6071e5154d096bd6f/globalmount\"" pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.503065 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-config-data-default\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.506031 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.510399 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.515598 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jpj2\" (UniqueName: \"kubernetes.io/projected/19027c02-7e5c-484e-ba9f-b9f9e7c4f81e-kube-api-access-9jpj2\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.579598 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-83eb712c-62fd-407b-b439-cd79e20b41fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83eb712c-62fd-407b-b439-cd79e20b41fa\") pod \"openstack-galera-0\" (UID: \"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e\") " pod="watcher-kuttl-default/openstack-galera-0"
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.584976 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"ca474341-5bf4-4fa5-bc46-56d42c3ccffd","Type":"ContainerStarted","Data":"394d88a3bfb03d34bff0ff2960df309d7428fdddd0a674cf70cd7371b14c129d"}
Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.602681 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstack-galera-0" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.644224 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/memcached-0"] Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.646571 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.649715 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"memcached-memcached-dockercfg-92tvb" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.649980 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-memcached-svc" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.650103 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"memcached-config-data" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.660131 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.811047 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0b0e4c3-5f9a-4415-9440-f2758780999a-kolla-config\") pod \"memcached-0\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.811375 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0b0e4c3-5f9a-4415-9440-f2758780999a-config-data\") pod \"memcached-0\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.811415 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b0e4c3-5f9a-4415-9440-f2758780999a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.811444 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b0e4c3-5f9a-4415-9440-f2758780999a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.811532 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gp4h\" (UniqueName: \"kubernetes.io/projected/d0b0e4c3-5f9a-4415-9440-f2758780999a-kube-api-access-2gp4h\") pod \"memcached-0\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.912810 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b0e4c3-5f9a-4415-9440-f2758780999a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.914379 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gp4h\" (UniqueName: \"kubernetes.io/projected/d0b0e4c3-5f9a-4415-9440-f2758780999a-kube-api-access-2gp4h\") pod \"memcached-0\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.914525 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0b0e4c3-5f9a-4415-9440-f2758780999a-kolla-config\") pod \"memcached-0\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.914557 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0b0e4c3-5f9a-4415-9440-f2758780999a-config-data\") pod \"memcached-0\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.914632 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b0e4c3-5f9a-4415-9440-f2758780999a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.916850 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0b0e4c3-5f9a-4415-9440-f2758780999a-config-data\") pod \"memcached-0\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.916900 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0b0e4c3-5f9a-4415-9440-f2758780999a-kolla-config\") pod \"memcached-0\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.923649 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b0e4c3-5f9a-4415-9440-f2758780999a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " 
pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.948373 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gp4h\" (UniqueName: \"kubernetes.io/projected/d0b0e4c3-5f9a-4415-9440-f2758780999a-kube-api-access-2gp4h\") pod \"memcached-0\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:35 crc kubenswrapper[4956]: I0314 09:16:35.948743 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b0e4c3-5f9a-4415-9440-f2758780999a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.039841 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.048875 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.050170 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.055942 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"telemetry-ceilometer-dockercfg-8w9fh" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.064872 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.231940 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzzvc\" (UniqueName: \"kubernetes.io/projected/dcc01311-198b-436e-b438-5229352baf03-kube-api-access-bzzvc\") pod \"kube-state-metrics-0\" (UID: \"dcc01311-198b-436e-b438-5229352baf03\") " pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.291268 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstack-galera-0"] Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.334640 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzzvc\" (UniqueName: \"kubernetes.io/projected/dcc01311-198b-436e-b438-5229352baf03-kube-api-access-bzzvc\") pod \"kube-state-metrics-0\" (UID: \"dcc01311-198b-436e-b438-5229352baf03\") " pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.364698 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzzvc\" (UniqueName: \"kubernetes.io/projected/dcc01311-198b-436e-b438-5229352baf03-kube-api-access-bzzvc\") pod \"kube-state-metrics-0\" (UID: \"dcc01311-198b-436e-b438-5229352baf03\") " pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.429979 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.603013 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e","Type":"ContainerStarted","Data":"5565e354b6770308f959080cbc1e7f833a4743d2e38c572499a919248134eea2"} Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.741135 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/alertmanager-metric-storage-0"] Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.743687 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.746850 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-cluster-tls-config" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.747096 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-generated" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.747216 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"metric-storage-alertmanager-dockercfg-gzk4w" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.755060 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-tls-assets-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.755385 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-web-config" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.755887 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.755926 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4vvv\" (UniqueName: \"kubernetes.io/projected/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-kube-api-access-s4vvv\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.755997 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.756023 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.756063 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 
09:16:36.756110 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.756196 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.762950 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/alertmanager-metric-storage-0"] Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.808846 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.861702 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.861778 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.861834 
4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.861965 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.861991 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4vvv\" (UniqueName: \"kubernetes.io/projected/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-kube-api-access-s4vvv\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.862755 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.862788 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc 
kubenswrapper[4956]: I0314 09:16:36.864391 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.880054 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.884134 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.888321 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.893007 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc 
kubenswrapper[4956]: I0314 09:16:36.899515 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4vvv\" (UniqueName: \"kubernetes.io/projected/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-kube-api-access-s4vvv\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.899767 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:36 crc kubenswrapper[4956]: I0314 09:16:36.985169 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Mar 14 09:16:37 crc kubenswrapper[4956]: W0314 09:16:37.015717 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcc01311_198b_436e_b438_5229352baf03.slice/crio-63778119c816b07d5a90abcb435cdeb5d31e8d2c61bbd6168ebd669372cb5549 WatchSource:0}: Error finding container 63778119c816b07d5a90abcb435cdeb5d31e8d2c61bbd6168ebd669372cb5549: Status 404 returned error can't find the container with id 63778119c816b07d5a90abcb435cdeb5d31e8d2c61bbd6168ebd669372cb5549 Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.080854 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.152263 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-hkwjk"] Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.154186 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hkwjk" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.159443 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-sbs8h" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.171450 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.173473 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-hkwjk"] Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.288912 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0be4e37c-1708-4556-9ac0-e6daaf8fdadf-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-hkwjk\" (UID: \"0be4e37c-1708-4556-9ac0-e6daaf8fdadf\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hkwjk" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.289232 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kc4m\" (UniqueName: \"kubernetes.io/projected/0be4e37c-1708-4556-9ac0-e6daaf8fdadf-kube-api-access-8kc4m\") pod \"observability-ui-dashboards-66cbf594b5-hkwjk\" (UID: \"0be4e37c-1708-4556-9ac0-e6daaf8fdadf\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hkwjk" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.389203 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.390654 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0be4e37c-1708-4556-9ac0-e6daaf8fdadf-serving-cert\") 
pod \"observability-ui-dashboards-66cbf594b5-hkwjk\" (UID: \"0be4e37c-1708-4556-9ac0-e6daaf8fdadf\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hkwjk" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.391048 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.398164 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0be4e37c-1708-4556-9ac0-e6daaf8fdadf-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-hkwjk\" (UID: \"0be4e37c-1708-4556-9ac0-e6daaf8fdadf\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hkwjk" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.399027 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kc4m\" (UniqueName: \"kubernetes.io/projected/0be4e37c-1708-4556-9ac0-e6daaf8fdadf-kube-api-access-8kc4m\") pod \"observability-ui-dashboards-66cbf594b5-hkwjk\" (UID: \"0be4e37c-1708-4556-9ac0-e6daaf8fdadf\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hkwjk" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.402514 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.402709 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.403331 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"metric-storage-prometheus-dockercfg-5tqbw" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.403441 4956 reflector.go:368] Caches populated for *v1.Secret from 
object-"watcher-kuttl-default"/"prometheus-metric-storage-tls-assets-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.403341 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"prometheus-metric-storage-rulefiles-2" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.403669 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"prometheus-metric-storage-rulefiles-1" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.403406 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"prometheus-metric-storage-rulefiles-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.403914 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-web-config" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.431178 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.444718 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kc4m\" (UniqueName: \"kubernetes.io/projected/0be4e37c-1708-4556-9ac0-e6daaf8fdadf-kube-api-access-8kc4m\") pod \"observability-ui-dashboards-66cbf594b5-hkwjk\" (UID: \"0be4e37c-1708-4556-9ac0-e6daaf8fdadf\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hkwjk" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.494286 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hkwjk" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.602796 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.602861 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.602886 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s4tf\" (UniqueName: \"kubernetes.io/projected/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-kube-api-access-9s4tf\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.602940 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.602969 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.603001 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.603018 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-config\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.603032 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.603077 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " 
pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.603114 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.639426 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-699f6c78c-n8rhg"] Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.640352 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.682000 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-699f6c78c-n8rhg"] Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.712376 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.712427 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.712457 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.712501 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s4tf\" (UniqueName: \"kubernetes.io/projected/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-kube-api-access-9s4tf\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.712567 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.712598 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.712636 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " 
pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.712657 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-config\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.712679 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.712721 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.787844 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.788381 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-2\") pod 
\"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.788475 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.800322 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.828208 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-config\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.832116 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"d0b0e4c3-5f9a-4415-9440-f2758780999a","Type":"ContainerStarted","Data":"3f321230ac8973d08651ef5ec007be3e363fb1f9240503348796de056e19a768"} Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.832290 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-config-out\") pod \"prometheus-metric-storage-0\" (UID: 
\"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.832364 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.832829 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.862241 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s4tf\" (UniqueName: \"kubernetes.io/projected/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-kube-api-access-9s4tf\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.899414 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"dcc01311-198b-436e-b438-5229352baf03","Type":"ContainerStarted","Data":"63778119c816b07d5a90abcb435cdeb5d31e8d2c61bbd6168ebd669372cb5549"} Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.913348 4956 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.913392 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8f091aa0a73504e8de1868a249a622c7714787e76cf8f7a2cc53e1715cf3d80f/globalmount\"" pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.915837 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/alertmanager-metric-storage-0"] Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.916264 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-trusted-ca-bundle\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.916391 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-console-oauth-config\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.916422 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-oauth-serving-cert\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " 
pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.916450 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-console-config\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.916656 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pl8m\" (UniqueName: \"kubernetes.io/projected/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-kube-api-access-8pl8m\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.916690 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-service-ca\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:37 crc kubenswrapper[4956]: I0314 09:16:37.916713 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-console-serving-cert\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.024386 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-console-oauth-config\") pod \"console-699f6c78c-n8rhg\" (UID: 
\"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.024431 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-oauth-serving-cert\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.024459 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-console-config\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.024510 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pl8m\" (UniqueName: \"kubernetes.io/projected/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-kube-api-access-8pl8m\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.024535 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-service-ca\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.024558 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-console-serving-cert\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") 
" pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.024588 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-trusted-ca-bundle\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.025637 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-console-config\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.026475 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-service-ca\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.026729 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-oauth-serving-cert\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.041116 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-console-oauth-config\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:38 crc 
kubenswrapper[4956]: I0314 09:16:38.045740 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-console-serving-cert\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.050550 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-trusted-ca-bundle\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.094812 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pl8m\" (UniqueName: \"kubernetes.io/projected/8c4a5b86-919c-4db1-bb99-c3f1699bb9a9-kube-api-access-8pl8m\") pod \"console-699f6c78c-n8rhg\" (UID: \"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9\") " pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.133280 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\") pod \"prometheus-metric-storage-0\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.302161 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.331801 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.557024 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-hkwjk"] Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.927635 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hkwjk" event={"ID":"0be4e37c-1708-4556-9ac0-e6daaf8fdadf","Type":"ContainerStarted","Data":"2aa13a89bab42a15ffa721c463f2ca9110caaaba989883aa1d4cf9d5c740b323"} Mar 14 09:16:38 crc kubenswrapper[4956]: I0314 09:16:38.929289 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531","Type":"ContainerStarted","Data":"e4dfc74293e8feb0681c5afd445c2364ef7fcbca7569a70bc3ccff882be943d7"} Mar 14 09:16:39 crc kubenswrapper[4956]: I0314 09:16:39.045384 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-699f6c78c-n8rhg"] Mar 14 09:16:39 crc kubenswrapper[4956]: I0314 09:16:39.327579 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Mar 14 09:16:39 crc kubenswrapper[4956]: W0314 09:16:39.348286 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7599eb8_703d_4b46_bb20_0ff9b1a9f4a5.slice/crio-60ef719f511e2921c4e7a8c32e1c285e38edb5153cdab0a51dbc4c0e666b0b8b WatchSource:0}: Error finding container 60ef719f511e2921c4e7a8c32e1c285e38edb5153cdab0a51dbc4c0e666b0b8b: Status 404 returned error can't find the container with id 60ef719f511e2921c4e7a8c32e1c285e38edb5153cdab0a51dbc4c0e666b0b8b Mar 14 09:16:39 crc kubenswrapper[4956]: W0314 09:16:39.363684 4956 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c4a5b86_919c_4db1_bb99_c3f1699bb9a9.slice/crio-d01f61e1932cd9980fbb54028efd5fd0439d072d6a213352f34711b583ba59f0 WatchSource:0}: Error finding container d01f61e1932cd9980fbb54028efd5fd0439d072d6a213352f34711b583ba59f0: Status 404 returned error can't find the container with id d01f61e1932cd9980fbb54028efd5fd0439d072d6a213352f34711b583ba59f0 Mar 14 09:16:39 crc kubenswrapper[4956]: I0314 09:16:39.957125 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5","Type":"ContainerStarted","Data":"60ef719f511e2921c4e7a8c32e1c285e38edb5153cdab0a51dbc4c0e666b0b8b"} Mar 14 09:16:39 crc kubenswrapper[4956]: I0314 09:16:39.958462 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699f6c78c-n8rhg" event={"ID":"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9","Type":"ContainerStarted","Data":"d01f61e1932cd9980fbb54028efd5fd0439d072d6a213352f34711b583ba59f0"} Mar 14 09:16:47 crc kubenswrapper[4956]: I0314 09:16:47.022528 4956 scope.go:117] "RemoveContainer" containerID="3f273f4d984fda46ac4df9bbc1556168d89cc3926fa6db11b764ec4fed622546" Mar 14 09:16:51 crc kubenswrapper[4956]: E0314 09:16:51.363225 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 14 09:16:51 crc kubenswrapper[4956]: E0314 09:16:51.364054 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > 
/var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6xnmm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil
,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-notifications-server-0_watcher-kuttl-default(ca474341-5bf4-4fa5-bc46-56d42c3ccffd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 09:16:51 crc kubenswrapper[4956]: E0314 09:16:51.365366 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" podUID="ca474341-5bf4-4fa5-bc46-56d42c3ccffd" Mar 14 09:16:51 crc kubenswrapper[4956]: E0314 09:16:51.394572 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 14 09:16:51 crc kubenswrapper[4956]: E0314 09:16:51.394834 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gn6pt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_watcher-kuttl-default(28cef006-3a3a-464f-b6d0-9faea75b0a9e): ErrImagePull: rpc error: code = 
Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 09:16:51 crc kubenswrapper[4956]: E0314 09:16:51.395985 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/rabbitmq-server-0" podUID="28cef006-3a3a-464f-b6d0-9faea75b0a9e" Mar 14 09:16:52 crc kubenswrapper[4956]: E0314 09:16:52.063380 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" podUID="ca474341-5bf4-4fa5-bc46-56d42c3ccffd" Mar 14 09:16:52 crc kubenswrapper[4956]: E0314 09:16:52.063751 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="watcher-kuttl-default/rabbitmq-server-0" podUID="28cef006-3a3a-464f-b6d0-9faea75b0a9e" Mar 14 09:16:53 crc kubenswrapper[4956]: E0314 09:16:53.533294 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Mar 14 09:16:53 crc kubenswrapper[4956]: E0314 09:16:53.533701 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9jpj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-galera-0_watcher-kuttl-default(19027c02-7e5c-484e-ba9f-b9f9e7c4f81e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 09:16:53 crc kubenswrapper[4956]: E0314 09:16:53.534885 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/openstack-galera-0" podUID="19027c02-7e5c-484e-ba9f-b9f9e7c4f81e" Mar 14 09:16:53 crc kubenswrapper[4956]: E0314 09:16:53.856535 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 14 09:16:53 crc kubenswrapper[4956]: E0314 09:16:53.856791 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 14 09:16:53 crc kubenswrapper[4956]: E0314 09:16:53.856925 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods 
--namespaces=watcher-kuttl-default],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bzzvc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_watcher-kuttl-default(dcc01311-198b-436e-b438-5229352baf03): ErrImagePull: rpc error: code = Canceled desc = copying system image from 
manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 09:16:53 crc kubenswrapper[4956]: E0314 09:16:53.858414 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="watcher-kuttl-default/kube-state-metrics-0" podUID="dcc01311-198b-436e-b438-5229352baf03" Mar 14 09:16:54 crc kubenswrapper[4956]: I0314 09:16:54.102586 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699f6c78c-n8rhg" event={"ID":"8c4a5b86-919c-4db1-bb99-c3f1699bb9a9","Type":"ContainerStarted","Data":"3aeef1faf04033c9c0dab4a0b697f1534694960d333690b0f8f3f8d835f4df04"} Mar 14 09:16:54 crc kubenswrapper[4956]: E0314 09:16:54.104214 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="watcher-kuttl-default/kube-state-metrics-0" podUID="dcc01311-198b-436e-b438-5229352baf03" Mar 14 09:16:54 crc kubenswrapper[4956]: E0314 09:16:54.104459 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="watcher-kuttl-default/openstack-galera-0" podUID="19027c02-7e5c-484e-ba9f-b9f9e7c4f81e" Mar 14 09:16:54 crc kubenswrapper[4956]: I0314 09:16:54.143022 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-699f6c78c-n8rhg" podStartSLOduration=17.143002623 podStartE2EDuration="17.143002623s" podCreationTimestamp="2026-03-14 09:16:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-14 09:16:54.139038555 +0000 UTC m=+1219.651730823" watchObservedRunningTime="2026-03-14 09:16:54.143002623 +0000 UTC m=+1219.655694891" Mar 14 09:16:55 crc kubenswrapper[4956]: I0314 09:16:55.113233 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"d0b0e4c3-5f9a-4415-9440-f2758780999a","Type":"ContainerStarted","Data":"40c00dcf4fdbf2fbc139da1d3144ae367b399901fa005550f10307fd6f6a5d41"} Mar 14 09:16:55 crc kubenswrapper[4956]: I0314 09:16:55.113705 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/memcached-0" Mar 14 09:16:55 crc kubenswrapper[4956]: I0314 09:16:55.116246 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hkwjk" event={"ID":"0be4e37c-1708-4556-9ac0-e6daaf8fdadf","Type":"ContainerStarted","Data":"6b330cce4f85a87c9833be39337c18f88a068f32704290605bfacca96178ec15"} Mar 14 09:16:55 crc kubenswrapper[4956]: I0314 09:16:55.143266 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/memcached-0" podStartSLOduration=3.093362727 podStartE2EDuration="20.143246842s" podCreationTimestamp="2026-03-14 09:16:35 +0000 UTC" firstStartedPulling="2026-03-14 09:16:36.820867771 +0000 UTC m=+1202.333560029" lastFinishedPulling="2026-03-14 09:16:53.870751876 +0000 UTC m=+1219.383444144" observedRunningTime="2026-03-14 09:16:55.131640443 +0000 UTC m=+1220.644332711" watchObservedRunningTime="2026-03-14 09:16:55.143246842 +0000 UTC m=+1220.655939130" Mar 14 09:16:55 crc kubenswrapper[4956]: I0314 09:16:55.152570 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-hkwjk" podStartSLOduration=10.8871245 podStartE2EDuration="18.152548774s" podCreationTimestamp="2026-03-14 09:16:37 +0000 UTC" firstStartedPulling="2026-03-14 09:16:38.74865534 +0000 
UTC m=+1204.261347608" lastFinishedPulling="2026-03-14 09:16:46.014079614 +0000 UTC m=+1211.526771882" observedRunningTime="2026-03-14 09:16:55.150666307 +0000 UTC m=+1220.663358585" watchObservedRunningTime="2026-03-14 09:16:55.152548774 +0000 UTC m=+1220.665241032" Mar 14 09:16:55 crc kubenswrapper[4956]: I0314 09:16:55.424242 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:16:55 crc kubenswrapper[4956]: I0314 09:16:55.424776 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:16:55 crc kubenswrapper[4956]: I0314 09:16:55.424990 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 09:16:55 crc kubenswrapper[4956]: I0314 09:16:55.426421 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"627cfd99357e3b66570faed20a1ce7ae2cc7c510054d839a6cb159952f60d6be"} pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:16:55 crc kubenswrapper[4956]: I0314 09:16:55.426866 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" 
containerID="cri-o://627cfd99357e3b66570faed20a1ce7ae2cc7c510054d839a6cb159952f60d6be" gracePeriod=600 Mar 14 09:16:56 crc kubenswrapper[4956]: I0314 09:16:56.124945 4956 generic.go:334] "Generic (PLEG): container finished" podID="9ba20367-e506-422e-a846-eb1525cb3b94" containerID="627cfd99357e3b66570faed20a1ce7ae2cc7c510054d839a6cb159952f60d6be" exitCode=0 Mar 14 09:16:56 crc kubenswrapper[4956]: I0314 09:16:56.125020 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerDied","Data":"627cfd99357e3b66570faed20a1ce7ae2cc7c510054d839a6cb159952f60d6be"} Mar 14 09:16:56 crc kubenswrapper[4956]: I0314 09:16:56.125392 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerStarted","Data":"d51ed46fef8fdbd34e0f9aab241d72045408a6fd0c1f2415549d37d1cae23089"} Mar 14 09:16:56 crc kubenswrapper[4956]: I0314 09:16:56.125416 4956 scope.go:117] "RemoveContainer" containerID="c0530da8ab6e9909827338d4f4090fb9eca5500f5a446223956b17c49ce8aec4" Mar 14 09:16:57 crc kubenswrapper[4956]: I0314 09:16:57.134669 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5","Type":"ContainerStarted","Data":"bd3c1562670a6088755fef60e78efa87b01e88c6882845ca558c0b64a107adb9"} Mar 14 09:16:57 crc kubenswrapper[4956]: I0314 09:16:57.136095 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531","Type":"ContainerStarted","Data":"0af7ee67ed00567b97bb8226efd4cb1cc2deb826c00ff005d1f4c6bceb356ae1"} Mar 14 09:16:58 crc kubenswrapper[4956]: I0314 09:16:58.303081 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:58 crc kubenswrapper[4956]: I0314 09:16:58.303184 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:58 crc kubenswrapper[4956]: I0314 09:16:58.308987 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:59 crc kubenswrapper[4956]: I0314 09:16:59.153016 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-699f6c78c-n8rhg" Mar 14 09:16:59 crc kubenswrapper[4956]: I0314 09:16:59.234646 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-757d47dbcd-vwrnb"] Mar 14 09:17:01 crc kubenswrapper[4956]: I0314 09:17:01.041707 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/memcached-0" Mar 14 09:17:05 crc kubenswrapper[4956]: I0314 09:17:05.193955 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"dcc01311-198b-436e-b438-5229352baf03","Type":"ContainerStarted","Data":"8e987a2758a809eda8aab5c2e246171d528b2215024d0907ad0e630edb027c5a"} Mar 14 09:17:05 crc kubenswrapper[4956]: I0314 09:17:05.195261 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:17:05 crc kubenswrapper[4956]: I0314 09:17:05.195641 4956 generic.go:334] "Generic (PLEG): container finished" podID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerID="bd3c1562670a6088755fef60e78efa87b01e88c6882845ca558c0b64a107adb9" exitCode=0 Mar 14 09:17:05 crc kubenswrapper[4956]: I0314 09:17:05.195740 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" 
event={"ID":"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5","Type":"ContainerDied","Data":"bd3c1562670a6088755fef60e78efa87b01e88c6882845ca558c0b64a107adb9"} Mar 14 09:17:05 crc kubenswrapper[4956]: I0314 09:17:05.198299 4956 generic.go:334] "Generic (PLEG): container finished" podID="f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531" containerID="0af7ee67ed00567b97bb8226efd4cb1cc2deb826c00ff005d1f4c6bceb356ae1" exitCode=0 Mar 14 09:17:05 crc kubenswrapper[4956]: I0314 09:17:05.198342 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531","Type":"ContainerDied","Data":"0af7ee67ed00567b97bb8226efd4cb1cc2deb826c00ff005d1f4c6bceb356ae1"} Mar 14 09:17:05 crc kubenswrapper[4956]: I0314 09:17:05.217799 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/kube-state-metrics-0" podStartSLOduration=1.467777404 podStartE2EDuration="29.217773444s" podCreationTimestamp="2026-03-14 09:16:36 +0000 UTC" firstStartedPulling="2026-03-14 09:16:37.020206403 +0000 UTC m=+1202.532898671" lastFinishedPulling="2026-03-14 09:17:04.770202443 +0000 UTC m=+1230.282894711" observedRunningTime="2026-03-14 09:17:05.214037131 +0000 UTC m=+1230.726729399" watchObservedRunningTime="2026-03-14 09:17:05.217773444 +0000 UTC m=+1230.730465722" Mar 14 09:17:06 crc kubenswrapper[4956]: I0314 09:17:06.207799 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"28cef006-3a3a-464f-b6d0-9faea75b0a9e","Type":"ContainerStarted","Data":"28f96e3df998fa6573d1e6ebce498ebc4ffd44a5aa68594d0823346f578855b8"} Mar 14 09:17:06 crc kubenswrapper[4956]: I0314 09:17:06.210590 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" 
event={"ID":"ca474341-5bf4-4fa5-bc46-56d42c3ccffd","Type":"ContainerStarted","Data":"5d1375ba14e52f131a635ea6a35fa962b8488b45ae5a84415cf074cd3ff2474d"} Mar 14 09:17:08 crc kubenswrapper[4956]: I0314 09:17:08.232093 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531","Type":"ContainerStarted","Data":"4fc76e404a52ee110407ace02576590e62f402e888d667d41d06ba76cca93d66"} Mar 14 09:17:14 crc kubenswrapper[4956]: I0314 09:17:14.290162 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531","Type":"ContainerStarted","Data":"385c1565f5aa12b0943413560cb03a89c59cf819a160cc9c60e4354993c5953a"} Mar 14 09:17:14 crc kubenswrapper[4956]: I0314 09:17:14.290681 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:17:14 crc kubenswrapper[4956]: I0314 09:17:14.293916 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Mar 14 09:17:14 crc kubenswrapper[4956]: I0314 09:17:14.314942 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/alertmanager-metric-storage-0" podStartSLOduration=8.919354163 podStartE2EDuration="38.314922316s" podCreationTimestamp="2026-03-14 09:16:36 +0000 UTC" firstStartedPulling="2026-03-14 09:16:37.972052549 +0000 UTC m=+1203.484744817" lastFinishedPulling="2026-03-14 09:17:07.367620702 +0000 UTC m=+1232.880312970" observedRunningTime="2026-03-14 09:17:14.310878915 +0000 UTC m=+1239.823571193" watchObservedRunningTime="2026-03-14 09:17:14.314922316 +0000 UTC m=+1239.827614584" Mar 14 09:17:15 crc kubenswrapper[4956]: I0314 09:17:15.300903 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5","Type":"ContainerStarted","Data":"75b3bc4ca1c53f2a49216849c45b739905013004ce0d661bdba9621f6a8ccb5e"} Mar 14 09:17:15 crc kubenswrapper[4956]: I0314 09:17:15.302573 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e","Type":"ContainerStarted","Data":"7f8b8c12c13da93367fe6c553c8ee27ddbc10797c3703ede1b9364e78cf512a8"} Mar 14 09:17:16 crc kubenswrapper[4956]: I0314 09:17:16.435596 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:17:18 crc kubenswrapper[4956]: I0314 09:17:18.332664 4956 generic.go:334] "Generic (PLEG): container finished" podID="19027c02-7e5c-484e-ba9f-b9f9e7c4f81e" containerID="7f8b8c12c13da93367fe6c553c8ee27ddbc10797c3703ede1b9364e78cf512a8" exitCode=0 Mar 14 09:17:18 crc kubenswrapper[4956]: I0314 09:17:18.332801 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e","Type":"ContainerDied","Data":"7f8b8c12c13da93367fe6c553c8ee27ddbc10797c3703ede1b9364e78cf512a8"} Mar 14 09:17:18 crc kubenswrapper[4956]: I0314 09:17:18.337431 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5","Type":"ContainerStarted","Data":"e744ed04de33b1074edf6014e8c1f2f3da5591ef2eb3e60175e67a84261055ac"} Mar 14 09:17:19 crc kubenswrapper[4956]: I0314 09:17:19.346785 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"19027c02-7e5c-484e-ba9f-b9f9e7c4f81e","Type":"ContainerStarted","Data":"44fbefaf2379054ab1054dcfe868f15da08011401ab62bb458a2d1cb9e65809d"} Mar 14 09:17:19 crc kubenswrapper[4956]: I0314 09:17:19.376690 4956 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/openstack-galera-0" podStartSLOduration=7.523777814 podStartE2EDuration="45.376668762s" podCreationTimestamp="2026-03-14 09:16:34 +0000 UTC" firstStartedPulling="2026-03-14 09:16:36.317687645 +0000 UTC m=+1201.830379923" lastFinishedPulling="2026-03-14 09:17:14.170578603 +0000 UTC m=+1239.683270871" observedRunningTime="2026-03-14 09:17:19.369426692 +0000 UTC m=+1244.882118960" watchObservedRunningTime="2026-03-14 09:17:19.376668762 +0000 UTC m=+1244.889361030" Mar 14 09:17:20 crc kubenswrapper[4956]: I0314 09:17:20.356471 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5","Type":"ContainerStarted","Data":"1d3f9b5f71d358958ba458a76f43aaa12b51b07a850a8553c501dae22c635862"} Mar 14 09:17:20 crc kubenswrapper[4956]: I0314 09:17:20.384884 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/prometheus-metric-storage-0" podStartSLOduration=3.810871853 podStartE2EDuration="44.38486529s" podCreationTimestamp="2026-03-14 09:16:36 +0000 UTC" firstStartedPulling="2026-03-14 09:16:39.361755223 +0000 UTC m=+1204.874447491" lastFinishedPulling="2026-03-14 09:17:19.93574866 +0000 UTC m=+1245.448440928" observedRunningTime="2026-03-14 09:17:20.379990929 +0000 UTC m=+1245.892683197" watchObservedRunningTime="2026-03-14 09:17:20.38486529 +0000 UTC m=+1245.897557558" Mar 14 09:17:23 crc kubenswrapper[4956]: I0314 09:17:23.332839 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:17:23 crc kubenswrapper[4956]: I0314 09:17:23.332902 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:17:23 crc kubenswrapper[4956]: I0314 09:17:23.336239 4956 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:17:23 crc kubenswrapper[4956]: I0314 09:17:23.379680 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.282908 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-757d47dbcd-vwrnb" podUID="d97bedae-1c06-4fac-8861-a44f69574365" containerName="console" containerID="cri-o://1581cbc4cbc458d627abd4427d8dc4949bae0fefff19cfec1b374cea5c73d217" gracePeriod=15 Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.676909 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-757d47dbcd-vwrnb_d97bedae-1c06-4fac-8861-a44f69574365/console/0.log" Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.677357 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.715999 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-oauth-serving-cert\") pod \"d97bedae-1c06-4fac-8861-a44f69574365\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.716041 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw9dc\" (UniqueName: \"kubernetes.io/projected/d97bedae-1c06-4fac-8861-a44f69574365-kube-api-access-nw9dc\") pod \"d97bedae-1c06-4fac-8861-a44f69574365\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.716107 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-trusted-ca-bundle\") pod \"d97bedae-1c06-4fac-8861-a44f69574365\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.716177 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-console-config\") pod \"d97bedae-1c06-4fac-8861-a44f69574365\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.716236 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d97bedae-1c06-4fac-8861-a44f69574365-console-oauth-config\") pod \"d97bedae-1c06-4fac-8861-a44f69574365\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.716255 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-service-ca\") pod \"d97bedae-1c06-4fac-8861-a44f69574365\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.716312 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d97bedae-1c06-4fac-8861-a44f69574365-console-serving-cert\") pod \"d97bedae-1c06-4fac-8861-a44f69574365\" (UID: \"d97bedae-1c06-4fac-8861-a44f69574365\") " Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.717041 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d97bedae-1c06-4fac-8861-a44f69574365" (UID: "d97bedae-1c06-4fac-8861-a44f69574365"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.717074 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-service-ca" (OuterVolumeSpecName: "service-ca") pod "d97bedae-1c06-4fac-8861-a44f69574365" (UID: "d97bedae-1c06-4fac-8861-a44f69574365"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.717142 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-console-config" (OuterVolumeSpecName: "console-config") pod "d97bedae-1c06-4fac-8861-a44f69574365" (UID: "d97bedae-1c06-4fac-8861-a44f69574365"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.717217 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d97bedae-1c06-4fac-8861-a44f69574365" (UID: "d97bedae-1c06-4fac-8861-a44f69574365"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.717507 4956 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.717528 4956 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.717543 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.717554 4956 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d97bedae-1c06-4fac-8861-a44f69574365-console-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.721172 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d97bedae-1c06-4fac-8861-a44f69574365-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d97bedae-1c06-4fac-8861-a44f69574365" (UID: "d97bedae-1c06-4fac-8861-a44f69574365"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.721649 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d97bedae-1c06-4fac-8861-a44f69574365-kube-api-access-nw9dc" (OuterVolumeSpecName: "kube-api-access-nw9dc") pod "d97bedae-1c06-4fac-8861-a44f69574365" (UID: "d97bedae-1c06-4fac-8861-a44f69574365"). InnerVolumeSpecName "kube-api-access-nw9dc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.724387 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d97bedae-1c06-4fac-8861-a44f69574365-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d97bedae-1c06-4fac-8861-a44f69574365" (UID: "d97bedae-1c06-4fac-8861-a44f69574365"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.818984 4956 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d97bedae-1c06-4fac-8861-a44f69574365-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.819035 4956 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d97bedae-1c06-4fac-8861-a44f69574365-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:24 crc kubenswrapper[4956]: I0314 09:17:24.819049 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw9dc\" (UniqueName: \"kubernetes.io/projected/d97bedae-1c06-4fac-8861-a44f69574365-kube-api-access-nw9dc\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:25 crc kubenswrapper[4956]: I0314 09:17:25.395084 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-757d47dbcd-vwrnb_d97bedae-1c06-4fac-8861-a44f69574365/console/0.log" Mar 14 09:17:25 crc kubenswrapper[4956]: I0314 09:17:25.395141 4956 generic.go:334] "Generic (PLEG): container finished" podID="d97bedae-1c06-4fac-8861-a44f69574365" containerID="1581cbc4cbc458d627abd4427d8dc4949bae0fefff19cfec1b374cea5c73d217" exitCode=2 Mar 14 09:17:25 crc kubenswrapper[4956]: I0314 09:17:25.395176 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-757d47dbcd-vwrnb" 
event={"ID":"d97bedae-1c06-4fac-8861-a44f69574365","Type":"ContainerDied","Data":"1581cbc4cbc458d627abd4427d8dc4949bae0fefff19cfec1b374cea5c73d217"} Mar 14 09:17:25 crc kubenswrapper[4956]: I0314 09:17:25.395209 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-757d47dbcd-vwrnb" event={"ID":"d97bedae-1c06-4fac-8861-a44f69574365","Type":"ContainerDied","Data":"195f679bb058cd7bb2aad41a5684358fea7d4153c9bc296c87677c048b642b2e"} Mar 14 09:17:25 crc kubenswrapper[4956]: I0314 09:17:25.395227 4956 scope.go:117] "RemoveContainer" containerID="1581cbc4cbc458d627abd4427d8dc4949bae0fefff19cfec1b374cea5c73d217" Mar 14 09:17:25 crc kubenswrapper[4956]: I0314 09:17:25.395255 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-757d47dbcd-vwrnb" Mar 14 09:17:25 crc kubenswrapper[4956]: I0314 09:17:25.423560 4956 scope.go:117] "RemoveContainer" containerID="1581cbc4cbc458d627abd4427d8dc4949bae0fefff19cfec1b374cea5c73d217" Mar 14 09:17:25 crc kubenswrapper[4956]: E0314 09:17:25.424290 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1581cbc4cbc458d627abd4427d8dc4949bae0fefff19cfec1b374cea5c73d217\": container with ID starting with 1581cbc4cbc458d627abd4427d8dc4949bae0fefff19cfec1b374cea5c73d217 not found: ID does not exist" containerID="1581cbc4cbc458d627abd4427d8dc4949bae0fefff19cfec1b374cea5c73d217" Mar 14 09:17:25 crc kubenswrapper[4956]: I0314 09:17:25.424338 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1581cbc4cbc458d627abd4427d8dc4949bae0fefff19cfec1b374cea5c73d217"} err="failed to get container status \"1581cbc4cbc458d627abd4427d8dc4949bae0fefff19cfec1b374cea5c73d217\": rpc error: code = NotFound desc = could not find container \"1581cbc4cbc458d627abd4427d8dc4949bae0fefff19cfec1b374cea5c73d217\": container with ID starting with 
1581cbc4cbc458d627abd4427d8dc4949bae0fefff19cfec1b374cea5c73d217 not found: ID does not exist" Mar 14 09:17:25 crc kubenswrapper[4956]: I0314 09:17:25.426605 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-757d47dbcd-vwrnb"] Mar 14 09:17:25 crc kubenswrapper[4956]: I0314 09:17:25.436299 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-757d47dbcd-vwrnb"] Mar 14 09:17:25 crc kubenswrapper[4956]: I0314 09:17:25.603871 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/openstack-galera-0" Mar 14 09:17:25 crc kubenswrapper[4956]: I0314 09:17:25.603934 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/openstack-galera-0" Mar 14 09:17:25 crc kubenswrapper[4956]: I0314 09:17:25.926919 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/openstack-galera-0" Mar 14 09:17:26 crc kubenswrapper[4956]: I0314 09:17:26.122237 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Mar 14 09:17:26 crc kubenswrapper[4956]: I0314 09:17:26.122673 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerName="prometheus" containerID="cri-o://75b3bc4ca1c53f2a49216849c45b739905013004ce0d661bdba9621f6a8ccb5e" gracePeriod=600 Mar 14 09:17:26 crc kubenswrapper[4956]: I0314 09:17:26.122701 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerName="thanos-sidecar" containerID="cri-o://1d3f9b5f71d358958ba458a76f43aaa12b51b07a850a8553c501dae22c635862" gracePeriod=600 Mar 14 09:17:26 crc kubenswrapper[4956]: I0314 09:17:26.122755 4956 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerName="config-reloader" containerID="cri-o://e744ed04de33b1074edf6014e8c1f2f3da5591ef2eb3e60175e67a84261055ac" gracePeriod=600 Mar 14 09:17:26 crc kubenswrapper[4956]: I0314 09:17:26.407439 4956 generic.go:334] "Generic (PLEG): container finished" podID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerID="1d3f9b5f71d358958ba458a76f43aaa12b51b07a850a8553c501dae22c635862" exitCode=0 Mar 14 09:17:26 crc kubenswrapper[4956]: I0314 09:17:26.407769 4956 generic.go:334] "Generic (PLEG): container finished" podID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerID="75b3bc4ca1c53f2a49216849c45b739905013004ce0d661bdba9621f6a8ccb5e" exitCode=0 Mar 14 09:17:26 crc kubenswrapper[4956]: I0314 09:17:26.407531 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5","Type":"ContainerDied","Data":"1d3f9b5f71d358958ba458a76f43aaa12b51b07a850a8553c501dae22c635862"} Mar 14 09:17:26 crc kubenswrapper[4956]: I0314 09:17:26.407891 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5","Type":"ContainerDied","Data":"75b3bc4ca1c53f2a49216849c45b739905013004ce0d661bdba9621f6a8ccb5e"} Mar 14 09:17:26 crc kubenswrapper[4956]: I0314 09:17:26.491234 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/openstack-galera-0" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.142670 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.219612 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d97bedae-1c06-4fac-8861-a44f69574365" path="/var/lib/kubelet/pods/d97bedae-1c06-4fac-8861-a44f69574365/volumes" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.259422 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-tls-assets\") pod \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.259471 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-0\") pod \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.259543 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-1\") pod \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.259696 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\") pod \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.259725 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-thanos-prometheus-http-client-file\") pod \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.259800 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-config\") pod \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.259826 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-config-out\") pod \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.259853 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-2\") pod \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.259898 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-web-config\") pod \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\" (UID: \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.259920 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s4tf\" (UniqueName: \"kubernetes.io/projected/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-kube-api-access-9s4tf\") pod \"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\" (UID: 
\"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5\") " Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.259968 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" (UID: "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.259987 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" (UID: "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.260747 4956 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.260754 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" (UID: "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.260772 4956 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.264761 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-config-out" (OuterVolumeSpecName: "config-out") pod "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" (UID: "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.264784 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" (UID: "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.264869 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-kube-api-access-9s4tf" (OuterVolumeSpecName: "kube-api-access-9s4tf") pod "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" (UID: "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5"). InnerVolumeSpecName "kube-api-access-9s4tf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.264934 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" (UID: "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.265199 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-config" (OuterVolumeSpecName: "config") pod "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" (UID: "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.284273 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab81db53-2810-4c4c-8f38-945d27ae3199" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" (UID: "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5"). InnerVolumeSpecName "pvc-ab81db53-2810-4c4c-8f38-945d27ae3199". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.292575 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-web-config" (OuterVolumeSpecName: "web-config") pod "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" (UID: "d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.362919 4956 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\") on node \"crc\" " Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.362976 4956 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.362991 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.363004 4956 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-config-out\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.363018 4956 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.363034 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s4tf\" (UniqueName: \"kubernetes.io/projected/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-kube-api-access-9s4tf\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.363047 4956 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-web-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.363058 4956 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.380497 4956 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.380681 4956 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ab81db53-2810-4c4c-8f38-945d27ae3199" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab81db53-2810-4c4c-8f38-945d27ae3199") on node "crc" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.418189 4956 generic.go:334] "Generic (PLEG): container finished" podID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerID="e744ed04de33b1074edf6014e8c1f2f3da5591ef2eb3e60175e67a84261055ac" exitCode=0 Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.418300 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.418294 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5","Type":"ContainerDied","Data":"e744ed04de33b1074edf6014e8c1f2f3da5591ef2eb3e60175e67a84261055ac"} Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.418426 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5","Type":"ContainerDied","Data":"60ef719f511e2921c4e7a8c32e1c285e38edb5153cdab0a51dbc4c0e666b0b8b"} Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.418453 4956 scope.go:117] "RemoveContainer" containerID="1d3f9b5f71d358958ba458a76f43aaa12b51b07a850a8553c501dae22c635862" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.442155 4956 scope.go:117] "RemoveContainer" containerID="e744ed04de33b1074edf6014e8c1f2f3da5591ef2eb3e60175e67a84261055ac" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.456681 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.463638 4956 scope.go:117] "RemoveContainer" containerID="75b3bc4ca1c53f2a49216849c45b739905013004ce0d661bdba9621f6a8ccb5e" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.464462 4956 reconciler_common.go:293] "Volume detached for volume \"pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.465333 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.492186 4956 
scope.go:117] "RemoveContainer" containerID="bd3c1562670a6088755fef60e78efa87b01e88c6882845ca558c0b64a107adb9" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.494410 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Mar 14 09:17:27 crc kubenswrapper[4956]: E0314 09:17:27.494813 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerName="prometheus" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.494830 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerName="prometheus" Mar 14 09:17:27 crc kubenswrapper[4956]: E0314 09:17:27.494856 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d97bedae-1c06-4fac-8861-a44f69574365" containerName="console" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.494863 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97bedae-1c06-4fac-8861-a44f69574365" containerName="console" Mar 14 09:17:27 crc kubenswrapper[4956]: E0314 09:17:27.494905 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerName="init-config-reloader" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.494914 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerName="init-config-reloader" Mar 14 09:17:27 crc kubenswrapper[4956]: E0314 09:17:27.494928 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerName="config-reloader" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.494936 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerName="config-reloader" Mar 14 09:17:27 crc kubenswrapper[4956]: E0314 09:17:27.494955 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerName="thanos-sidecar" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.494962 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerName="thanos-sidecar" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.495127 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d97bedae-1c06-4fac-8861-a44f69574365" containerName="console" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.495145 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerName="thanos-sidecar" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.495156 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerName="prometheus" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.495166 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" containerName="config-reloader" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.497572 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.500046 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"prometheus-metric-storage-rulefiles-1" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.500135 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-metric-storage-prometheus-svc" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.500310 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.500347 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"metric-storage-prometheus-dockercfg-5tqbw" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.500536 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.500549 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"prometheus-metric-storage-rulefiles-2" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.501090 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-web-config" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.501794 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"prometheus-metric-storage-rulefiles-0" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.522682 4956 scope.go:117] "RemoveContainer" containerID="1d3f9b5f71d358958ba458a76f43aaa12b51b07a850a8553c501dae22c635862" Mar 14 09:17:27 crc kubenswrapper[4956]: E0314 09:17:27.523161 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"1d3f9b5f71d358958ba458a76f43aaa12b51b07a850a8553c501dae22c635862\": container with ID starting with 1d3f9b5f71d358958ba458a76f43aaa12b51b07a850a8553c501dae22c635862 not found: ID does not exist" containerID="1d3f9b5f71d358958ba458a76f43aaa12b51b07a850a8553c501dae22c635862" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.523197 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3f9b5f71d358958ba458a76f43aaa12b51b07a850a8553c501dae22c635862"} err="failed to get container status \"1d3f9b5f71d358958ba458a76f43aaa12b51b07a850a8553c501dae22c635862\": rpc error: code = NotFound desc = could not find container \"1d3f9b5f71d358958ba458a76f43aaa12b51b07a850a8553c501dae22c635862\": container with ID starting with 1d3f9b5f71d358958ba458a76f43aaa12b51b07a850a8553c501dae22c635862 not found: ID does not exist" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.523226 4956 scope.go:117] "RemoveContainer" containerID="e744ed04de33b1074edf6014e8c1f2f3da5591ef2eb3e60175e67a84261055ac" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.524705 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-tls-assets-0" Mar 14 09:17:27 crc kubenswrapper[4956]: E0314 09:17:27.533645 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e744ed04de33b1074edf6014e8c1f2f3da5591ef2eb3e60175e67a84261055ac\": container with ID starting with e744ed04de33b1074edf6014e8c1f2f3da5591ef2eb3e60175e67a84261055ac not found: ID does not exist" containerID="e744ed04de33b1074edf6014e8c1f2f3da5591ef2eb3e60175e67a84261055ac" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.533696 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e744ed04de33b1074edf6014e8c1f2f3da5591ef2eb3e60175e67a84261055ac"} err="failed to get container status 
\"e744ed04de33b1074edf6014e8c1f2f3da5591ef2eb3e60175e67a84261055ac\": rpc error: code = NotFound desc = could not find container \"e744ed04de33b1074edf6014e8c1f2f3da5591ef2eb3e60175e67a84261055ac\": container with ID starting with e744ed04de33b1074edf6014e8c1f2f3da5591ef2eb3e60175e67a84261055ac not found: ID does not exist" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.533728 4956 scope.go:117] "RemoveContainer" containerID="75b3bc4ca1c53f2a49216849c45b739905013004ce0d661bdba9621f6a8ccb5e" Mar 14 09:17:27 crc kubenswrapper[4956]: E0314 09:17:27.534141 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75b3bc4ca1c53f2a49216849c45b739905013004ce0d661bdba9621f6a8ccb5e\": container with ID starting with 75b3bc4ca1c53f2a49216849c45b739905013004ce0d661bdba9621f6a8ccb5e not found: ID does not exist" containerID="75b3bc4ca1c53f2a49216849c45b739905013004ce0d661bdba9621f6a8ccb5e" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.534162 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b3bc4ca1c53f2a49216849c45b739905013004ce0d661bdba9621f6a8ccb5e"} err="failed to get container status \"75b3bc4ca1c53f2a49216849c45b739905013004ce0d661bdba9621f6a8ccb5e\": rpc error: code = NotFound desc = could not find container \"75b3bc4ca1c53f2a49216849c45b739905013004ce0d661bdba9621f6a8ccb5e\": container with ID starting with 75b3bc4ca1c53f2a49216849c45b739905013004ce0d661bdba9621f6a8ccb5e not found: ID does not exist" Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.534180 4956 scope.go:117] "RemoveContainer" containerID="bd3c1562670a6088755fef60e78efa87b01e88c6882845ca558c0b64a107adb9" Mar 14 09:17:27 crc kubenswrapper[4956]: E0314 09:17:27.534410 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bd3c1562670a6088755fef60e78efa87b01e88c6882845ca558c0b64a107adb9\": container with ID starting with bd3c1562670a6088755fef60e78efa87b01e88c6882845ca558c0b64a107adb9 not found: ID does not exist" containerID="bd3c1562670a6088755fef60e78efa87b01e88c6882845ca558c0b64a107adb9"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.534434 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd3c1562670a6088755fef60e78efa87b01e88c6882845ca558c0b64a107adb9"} err="failed to get container status \"bd3c1562670a6088755fef60e78efa87b01e88c6882845ca558c0b64a107adb9\": rpc error: code = NotFound desc = could not find container \"bd3c1562670a6088755fef60e78efa87b01e88c6882845ca558c0b64a107adb9\": container with ID starting with bd3c1562670a6088755fef60e78efa87b01e88c6882845ca558c0b64a107adb9 not found: ID does not exist"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.548920 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"]
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.667378 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.667431 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.667456 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/51fad570-b371-4486-9a35-bea145391f8f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.667531 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/51fad570-b371-4486-9a35-bea145391f8f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.667552 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr468\" (UniqueName: \"kubernetes.io/projected/51fad570-b371-4486-9a35-bea145391f8f-kube-api-access-hr468\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.667595 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.667623 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-config\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.667640 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/51fad570-b371-4486-9a35-bea145391f8f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.667658 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/51fad570-b371-4486-9a35-bea145391f8f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.667679 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.667708 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.667738 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/51fad570-b371-4486-9a35-bea145391f8f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.667770 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.768802 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.768860 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-config\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.768886 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/51fad570-b371-4486-9a35-bea145391f8f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.768907 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/51fad570-b371-4486-9a35-bea145391f8f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.768930 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.768950 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.768969 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/51fad570-b371-4486-9a35-bea145391f8f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.768994 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.769025 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.769041 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.769061 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/51fad570-b371-4486-9a35-bea145391f8f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.769107 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/51fad570-b371-4486-9a35-bea145391f8f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.769122 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr468\" (UniqueName: \"kubernetes.io/projected/51fad570-b371-4486-9a35-bea145391f8f-kube-api-access-hr468\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.769979 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/51fad570-b371-4486-9a35-bea145391f8f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.770049 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/51fad570-b371-4486-9a35-bea145391f8f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.770654 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/51fad570-b371-4486-9a35-bea145391f8f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.774099 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/51fad570-b371-4486-9a35-bea145391f8f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.774208 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.774239 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-config\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.774630 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.774814 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.775098 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.777335 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/51fad570-b371-4486-9a35-bea145391f8f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.777544 4956 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.777584 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8f091aa0a73504e8de1868a249a622c7714787e76cf8f7a2cc53e1715cf3d80f/globalmount\"" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.778793 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/51fad570-b371-4486-9a35-bea145391f8f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.795879 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr468\" (UniqueName: \"kubernetes.io/projected/51fad570-b371-4486-9a35-bea145391f8f-kube-api-access-hr468\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.807996 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab81db53-2810-4c4c-8f38-945d27ae3199\") pod \"prometheus-metric-storage-0\" (UID: \"51fad570-b371-4486-9a35-bea145391f8f\") " pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:27 crc kubenswrapper[4956]: I0314 09:17:27.861985 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Mar 14 09:17:28 crc kubenswrapper[4956]: I0314 09:17:28.286553 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"]
Mar 14 09:17:28 crc kubenswrapper[4956]: W0314 09:17:28.292600 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51fad570_b371_4486_9a35_bea145391f8f.slice/crio-2a3a7b7f1423a0ad88a0fe53c97d2f20b711feb7cdce3f938d158dbe6127050d WatchSource:0}: Error finding container 2a3a7b7f1423a0ad88a0fe53c97d2f20b711feb7cdce3f938d158dbe6127050d: Status 404 returned error can't find the container with id 2a3a7b7f1423a0ad88a0fe53c97d2f20b711feb7cdce3f938d158dbe6127050d
Mar 14 09:17:28 crc kubenswrapper[4956]: I0314 09:17:28.428329 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"51fad570-b371-4486-9a35-bea145391f8f","Type":"ContainerStarted","Data":"2a3a7b7f1423a0ad88a0fe53c97d2f20b711feb7cdce3f938d158dbe6127050d"}
Mar 14 09:17:29 crc kubenswrapper[4956]: I0314 09:17:29.231915 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5" path="/var/lib/kubelet/pods/d7599eb8-703d-4b46-bb20-0ff9b1a9f4a5/volumes"
Mar 14 09:17:31 crc kubenswrapper[4956]: I0314 09:17:31.454813 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"51fad570-b371-4486-9a35-bea145391f8f","Type":"ContainerStarted","Data":"55457f5dd1bb3325f5ece3a416d5409ace52570fed00065af8819aed61915695"}
Mar 14 09:17:34 crc kubenswrapper[4956]: I0314 09:17:34.312192 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/root-account-create-update-tj699"]
Mar 14 09:17:34 crc kubenswrapper[4956]: I0314 09:17:34.313817 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/root-account-create-update-tj699"
Mar 14 09:17:34 crc kubenswrapper[4956]: I0314 09:17:34.319290 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"openstack-mariadb-root-db-secret"
Mar 14 09:17:34 crc kubenswrapper[4956]: I0314 09:17:34.327034 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/root-account-create-update-tj699"]
Mar 14 09:17:34 crc kubenswrapper[4956]: I0314 09:17:34.370607 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms29x\" (UniqueName: \"kubernetes.io/projected/c370ae95-6bb5-4a51-88ef-d0ce6811faa3-kube-api-access-ms29x\") pod \"root-account-create-update-tj699\" (UID: \"c370ae95-6bb5-4a51-88ef-d0ce6811faa3\") " pod="watcher-kuttl-default/root-account-create-update-tj699"
Mar 14 09:17:34 crc kubenswrapper[4956]: I0314 09:17:34.370684 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c370ae95-6bb5-4a51-88ef-d0ce6811faa3-operator-scripts\") pod \"root-account-create-update-tj699\" (UID: \"c370ae95-6bb5-4a51-88ef-d0ce6811faa3\") " pod="watcher-kuttl-default/root-account-create-update-tj699"
Mar 14 09:17:34 crc kubenswrapper[4956]: I0314 09:17:34.472722 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c370ae95-6bb5-4a51-88ef-d0ce6811faa3-operator-scripts\") pod \"root-account-create-update-tj699\" (UID: \"c370ae95-6bb5-4a51-88ef-d0ce6811faa3\") " pod="watcher-kuttl-default/root-account-create-update-tj699"
Mar 14 09:17:34 crc kubenswrapper[4956]: I0314 09:17:34.473233 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms29x\" (UniqueName: \"kubernetes.io/projected/c370ae95-6bb5-4a51-88ef-d0ce6811faa3-kube-api-access-ms29x\") pod \"root-account-create-update-tj699\" (UID: \"c370ae95-6bb5-4a51-88ef-d0ce6811faa3\") " pod="watcher-kuttl-default/root-account-create-update-tj699"
Mar 14 09:17:34 crc kubenswrapper[4956]: I0314 09:17:34.473513 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c370ae95-6bb5-4a51-88ef-d0ce6811faa3-operator-scripts\") pod \"root-account-create-update-tj699\" (UID: \"c370ae95-6bb5-4a51-88ef-d0ce6811faa3\") " pod="watcher-kuttl-default/root-account-create-update-tj699"
Mar 14 09:17:34 crc kubenswrapper[4956]: I0314 09:17:34.497544 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms29x\" (UniqueName: \"kubernetes.io/projected/c370ae95-6bb5-4a51-88ef-d0ce6811faa3-kube-api-access-ms29x\") pod \"root-account-create-update-tj699\" (UID: \"c370ae95-6bb5-4a51-88ef-d0ce6811faa3\") " pod="watcher-kuttl-default/root-account-create-update-tj699"
Mar 14 09:17:34 crc kubenswrapper[4956]: I0314 09:17:34.637138 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/root-account-create-update-tj699"
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.527073 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-db-create-f89ks"]
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.529647 4956 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-f89ks"
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.536507 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-create-f89ks"]
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.574242 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/root-account-create-update-tj699"]
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.589381 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"openstack-mariadb-root-db-secret"
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.591051 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x876w\" (UniqueName: \"kubernetes.io/projected/d8dd1e13-b758-484a-9310-0fbd88fdd7ca-kube-api-access-x876w\") pod \"keystone-db-create-f89ks\" (UID: \"d8dd1e13-b758-484a-9310-0fbd88fdd7ca\") " pod="watcher-kuttl-default/keystone-db-create-f89ks"
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.591137 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8dd1e13-b758-484a-9310-0fbd88fdd7ca-operator-scripts\") pod \"keystone-db-create-f89ks\" (UID: \"d8dd1e13-b758-484a-9310-0fbd88fdd7ca\") " pod="watcher-kuttl-default/keystone-db-create-f89ks"
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.647554 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz"]
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.649412 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz"
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.652334 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-db-secret"
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.663134 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz"]
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.692545 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x876w\" (UniqueName: \"kubernetes.io/projected/d8dd1e13-b758-484a-9310-0fbd88fdd7ca-kube-api-access-x876w\") pod \"keystone-db-create-f89ks\" (UID: \"d8dd1e13-b758-484a-9310-0fbd88fdd7ca\") " pod="watcher-kuttl-default/keystone-db-create-f89ks"
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.692642 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8dd1e13-b758-484a-9310-0fbd88fdd7ca-operator-scripts\") pod \"keystone-db-create-f89ks\" (UID: \"d8dd1e13-b758-484a-9310-0fbd88fdd7ca\") " pod="watcher-kuttl-default/keystone-db-create-f89ks"
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.693498 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8dd1e13-b758-484a-9310-0fbd88fdd7ca-operator-scripts\") pod \"keystone-db-create-f89ks\" (UID: \"d8dd1e13-b758-484a-9310-0fbd88fdd7ca\") " pod="watcher-kuttl-default/keystone-db-create-f89ks"
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.716605 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x876w\" (UniqueName: \"kubernetes.io/projected/d8dd1e13-b758-484a-9310-0fbd88fdd7ca-kube-api-access-x876w\") pod \"keystone-db-create-f89ks\" (UID: \"d8dd1e13-b758-484a-9310-0fbd88fdd7ca\") " pod="watcher-kuttl-default/keystone-db-create-f89ks"
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.793915 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44d63538-b0d7-4e51-9b51-8c3855488802-operator-scripts\") pod \"keystone-ee8a-account-create-update-gqmgz\" (UID: \"44d63538-b0d7-4e51-9b51-8c3855488802\") " pod="watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz"
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.794083 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hchjx\" (UniqueName: \"kubernetes.io/projected/44d63538-b0d7-4e51-9b51-8c3855488802-kube-api-access-hchjx\") pod \"keystone-ee8a-account-create-update-gqmgz\" (UID: \"44d63538-b0d7-4e51-9b51-8c3855488802\") " pod="watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz"
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.859653 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-f89ks"
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.895132 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hchjx\" (UniqueName: \"kubernetes.io/projected/44d63538-b0d7-4e51-9b51-8c3855488802-kube-api-access-hchjx\") pod \"keystone-ee8a-account-create-update-gqmgz\" (UID: \"44d63538-b0d7-4e51-9b51-8c3855488802\") " pod="watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz"
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.895209 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44d63538-b0d7-4e51-9b51-8c3855488802-operator-scripts\") pod \"keystone-ee8a-account-create-update-gqmgz\" (UID: \"44d63538-b0d7-4e51-9b51-8c3855488802\") " pod="watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz"
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.896125 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44d63538-b0d7-4e51-9b51-8c3855488802-operator-scripts\") pod \"keystone-ee8a-account-create-update-gqmgz\" (UID: \"44d63538-b0d7-4e51-9b51-8c3855488802\") " pod="watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz"
Mar 14 09:17:35 crc kubenswrapper[4956]: I0314 09:17:35.916176 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hchjx\" (UniqueName: \"kubernetes.io/projected/44d63538-b0d7-4e51-9b51-8c3855488802-kube-api-access-hchjx\") pod \"keystone-ee8a-account-create-update-gqmgz\" (UID: \"44d63538-b0d7-4e51-9b51-8c3855488802\") " pod="watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz"
Mar 14 09:17:36 crc kubenswrapper[4956]: I0314 09:17:36.009917 4956 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz"
Mar 14 09:17:36 crc kubenswrapper[4956]: I0314 09:17:36.283620 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-create-f89ks"]
Mar 14 09:17:36 crc kubenswrapper[4956]: W0314 09:17:36.342347 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8dd1e13_b758_484a_9310_0fbd88fdd7ca.slice/crio-c63ba60dc7c00360becf41a898f0aa2e3d5f82accc57821a22d7e2842a8a2eb3 WatchSource:0}: Error finding container c63ba60dc7c00360becf41a898f0aa2e3d5f82accc57821a22d7e2842a8a2eb3: Status 404 returned error can't find the container with id c63ba60dc7c00360becf41a898f0aa2e3d5f82accc57821a22d7e2842a8a2eb3
Mar 14 09:17:36 crc kubenswrapper[4956]: I0314 09:17:36.536683 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/root-account-create-update-tj699" event={"ID":"c370ae95-6bb5-4a51-88ef-d0ce6811faa3","Type":"ContainerDied","Data":"a60df08c4fee37c3b526cb8696446bb6aacb51461f984650f76a65bd96b9d973"}
Mar 14 09:17:36 crc kubenswrapper[4956]: I0314 09:17:36.536641 4956 generic.go:334] "Generic (PLEG): container finished" podID="c370ae95-6bb5-4a51-88ef-d0ce6811faa3" containerID="a60df08c4fee37c3b526cb8696446bb6aacb51461f984650f76a65bd96b9d973" exitCode=0
Mar 14 09:17:36 crc kubenswrapper[4956]: I0314 09:17:36.537690 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/root-account-create-update-tj699" event={"ID":"c370ae95-6bb5-4a51-88ef-d0ce6811faa3","Type":"ContainerStarted","Data":"42b4090dc1139e527e6bb5e0836e72eceec73c9efce2ef883c1f82c51673edd9"}
Mar 14 09:17:36 crc kubenswrapper[4956]: I0314 09:17:36.540321 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-f89ks" event={"ID":"d8dd1e13-b758-484a-9310-0fbd88fdd7ca","Type":"ContainerStarted","Data":"fa2bcc70198965d972eb6a2ee18e6c80aa308cdafc9b18fb7fb6003c6cb14ee9"}
Mar 14 09:17:36 crc kubenswrapper[4956]: I0314 09:17:36.540402 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-f89ks" event={"ID":"d8dd1e13-b758-484a-9310-0fbd88fdd7ca","Type":"ContainerStarted","Data":"c63ba60dc7c00360becf41a898f0aa2e3d5f82accc57821a22d7e2842a8a2eb3"}
Mar 14 09:17:36 crc kubenswrapper[4956]: I0314 09:17:36.576204 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-db-create-f89ks" podStartSLOduration=1.576184832 podStartE2EDuration="1.576184832s" podCreationTimestamp="2026-03-14 09:17:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:17:36.573971627 +0000 UTC m=+1262.086663905" watchObservedRunningTime="2026-03-14 09:17:36.576184832 +0000 UTC m=+1262.088877090"
Mar 14 09:17:36 crc kubenswrapper[4956]: I0314 09:17:36.675564 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz"]
Mar 14 09:17:37 crc kubenswrapper[4956]: I0314 09:17:37.551798 4956 generic.go:334] "Generic (PLEG): container finished" podID="d8dd1e13-b758-484a-9310-0fbd88fdd7ca" containerID="fa2bcc70198965d972eb6a2ee18e6c80aa308cdafc9b18fb7fb6003c6cb14ee9" exitCode=0
Mar 14 09:17:37 crc kubenswrapper[4956]: I0314 09:17:37.551876 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-f89ks" event={"ID":"d8dd1e13-b758-484a-9310-0fbd88fdd7ca","Type":"ContainerDied","Data":"fa2bcc70198965d972eb6a2ee18e6c80aa308cdafc9b18fb7fb6003c6cb14ee9"}
Mar 14 09:17:37 crc kubenswrapper[4956]: I0314 09:17:37.553933 4956 generic.go:334] "Generic (PLEG): container finished" podID="44d63538-b0d7-4e51-9b51-8c3855488802" containerID="54e3e91deedba475cc5c77f2aa4db0c5b6cfa2ef820109ee65a47f06539fc0c0" exitCode=0
Mar 14 09:17:37 crc kubenswrapper[4956]: I0314 09:17:37.554008 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz" event={"ID":"44d63538-b0d7-4e51-9b51-8c3855488802","Type":"ContainerDied","Data":"54e3e91deedba475cc5c77f2aa4db0c5b6cfa2ef820109ee65a47f06539fc0c0"}
Mar 14 09:17:37 crc kubenswrapper[4956]: I0314 09:17:37.554071 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz" event={"ID":"44d63538-b0d7-4e51-9b51-8c3855488802","Type":"ContainerStarted","Data":"3dea61804f5372b3b2c44e305c0f41e0b1a7e8f2cc1d9d2086b1893da4da07f9"}
Mar 14 09:17:37 crc kubenswrapper[4956]: I0314 09:17:37.840310 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/root-account-create-update-tj699"
Mar 14 09:17:37 crc kubenswrapper[4956]: I0314 09:17:37.939712 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms29x\" (UniqueName: \"kubernetes.io/projected/c370ae95-6bb5-4a51-88ef-d0ce6811faa3-kube-api-access-ms29x\") pod \"c370ae95-6bb5-4a51-88ef-d0ce6811faa3\" (UID: \"c370ae95-6bb5-4a51-88ef-d0ce6811faa3\") "
Mar 14 09:17:37 crc kubenswrapper[4956]: I0314 09:17:37.939845 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c370ae95-6bb5-4a51-88ef-d0ce6811faa3-operator-scripts\") pod \"c370ae95-6bb5-4a51-88ef-d0ce6811faa3\" (UID: \"c370ae95-6bb5-4a51-88ef-d0ce6811faa3\") "
Mar 14 09:17:37 crc kubenswrapper[4956]: I0314 09:17:37.940617 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c370ae95-6bb5-4a51-88ef-d0ce6811faa3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c370ae95-6bb5-4a51-88ef-d0ce6811faa3" (UID: "c370ae95-6bb5-4a51-88ef-d0ce6811faa3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 09:17:37 crc kubenswrapper[4956]: I0314 09:17:37.948910 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c370ae95-6bb5-4a51-88ef-d0ce6811faa3-kube-api-access-ms29x" (OuterVolumeSpecName: "kube-api-access-ms29x") pod "c370ae95-6bb5-4a51-88ef-d0ce6811faa3" (UID: "c370ae95-6bb5-4a51-88ef-d0ce6811faa3"). InnerVolumeSpecName "kube-api-access-ms29x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:17:38 crc kubenswrapper[4956]: I0314 09:17:38.042263 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms29x\" (UniqueName: \"kubernetes.io/projected/c370ae95-6bb5-4a51-88ef-d0ce6811faa3-kube-api-access-ms29x\") on node \"crc\" DevicePath \"\""
Mar 14 09:17:38 crc kubenswrapper[4956]: I0314 09:17:38.042302 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c370ae95-6bb5-4a51-88ef-d0ce6811faa3-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 09:17:38 crc kubenswrapper[4956]: I0314 09:17:38.566028 4956 generic.go:334] "Generic (PLEG): container finished" podID="ca474341-5bf4-4fa5-bc46-56d42c3ccffd" containerID="5d1375ba14e52f131a635ea6a35fa962b8488b45ae5a84415cf074cd3ff2474d" exitCode=0
Mar 14 09:17:38 crc kubenswrapper[4956]: I0314 09:17:38.566124 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"ca474341-5bf4-4fa5-bc46-56d42c3ccffd","Type":"ContainerDied","Data":"5d1375ba14e52f131a635ea6a35fa962b8488b45ae5a84415cf074cd3ff2474d"}
Mar 14 09:17:38 crc kubenswrapper[4956]: I0314 09:17:38.571445 4956 generic.go:334] "Generic (PLEG): container finished" podID="28cef006-3a3a-464f-b6d0-9faea75b0a9e" containerID="28f96e3df998fa6573d1e6ebce498ebc4ffd44a5aa68594d0823346f578855b8" exitCode=0
Mar 14 09:17:38 crc kubenswrapper[4956]: I0314 09:17:38.571540 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"28cef006-3a3a-464f-b6d0-9faea75b0a9e","Type":"ContainerDied","Data":"28f96e3df998fa6573d1e6ebce498ebc4ffd44a5aa68594d0823346f578855b8"}
Mar 14 09:17:38 crc kubenswrapper[4956]: I0314 09:17:38.574661 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/root-account-create-update-tj699" event={"ID":"c370ae95-6bb5-4a51-88ef-d0ce6811faa3","Type":"ContainerDied","Data":"42b4090dc1139e527e6bb5e0836e72eceec73c9efce2ef883c1f82c51673edd9"}
Mar 14 09:17:38 crc kubenswrapper[4956]: I0314 09:17:38.574715 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42b4090dc1139e527e6bb5e0836e72eceec73c9efce2ef883c1f82c51673edd9"
Mar 14 09:17:38 crc kubenswrapper[4956]: I0314 09:17:38.574956 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/root-account-create-update-tj699"
Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.534955 4956 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz" Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.567844 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hchjx\" (UniqueName: \"kubernetes.io/projected/44d63538-b0d7-4e51-9b51-8c3855488802-kube-api-access-hchjx\") pod \"44d63538-b0d7-4e51-9b51-8c3855488802\" (UID: \"44d63538-b0d7-4e51-9b51-8c3855488802\") " Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.568047 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44d63538-b0d7-4e51-9b51-8c3855488802-operator-scripts\") pod \"44d63538-b0d7-4e51-9b51-8c3855488802\" (UID: \"44d63538-b0d7-4e51-9b51-8c3855488802\") " Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.569269 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44d63538-b0d7-4e51-9b51-8c3855488802-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44d63538-b0d7-4e51-9b51-8c3855488802" (UID: "44d63538-b0d7-4e51-9b51-8c3855488802"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.583996 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d63538-b0d7-4e51-9b51-8c3855488802-kube-api-access-hchjx" (OuterVolumeSpecName: "kube-api-access-hchjx") pod "44d63538-b0d7-4e51-9b51-8c3855488802" (UID: "44d63538-b0d7-4e51-9b51-8c3855488802"). InnerVolumeSpecName "kube-api-access-hchjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.606010 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"28cef006-3a3a-464f-b6d0-9faea75b0a9e","Type":"ContainerStarted","Data":"c3e0a6d69f4ae32527b488ed76016873514d5eae1d3a7ecc2470b6bf42cc743d"} Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.606625 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/rabbitmq-server-0" Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.607683 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz" event={"ID":"44d63538-b0d7-4e51-9b51-8c3855488802","Type":"ContainerDied","Data":"3dea61804f5372b3b2c44e305c0f41e0b1a7e8f2cc1d9d2086b1893da4da07f9"} Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.607717 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dea61804f5372b3b2c44e305c0f41e0b1a7e8f2cc1d9d2086b1893da4da07f9" Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.607771 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz" Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.617957 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"ca474341-5bf4-4fa5-bc46-56d42c3ccffd","Type":"ContainerStarted","Data":"260f9543e346b5a94e657ffa239706a54e4809e5982cd317f472bce242668329"} Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.618192 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.640220 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/rabbitmq-server-0" podStartSLOduration=37.043659067 podStartE2EDuration="1m7.640204427s" podCreationTimestamp="2026-03-14 09:16:32 +0000 UTC" firstStartedPulling="2026-03-14 09:16:34.097704512 +0000 UTC m=+1199.610396780" lastFinishedPulling="2026-03-14 09:17:04.694249872 +0000 UTC m=+1230.206942140" observedRunningTime="2026-03-14 09:17:39.633764186 +0000 UTC m=+1265.146456464" watchObservedRunningTime="2026-03-14 09:17:39.640204427 +0000 UTC m=+1265.152896695" Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.673881 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44d63538-b0d7-4e51-9b51-8c3855488802-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.673932 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hchjx\" (UniqueName: \"kubernetes.io/projected/44d63538-b0d7-4e51-9b51-8c3855488802-kube-api-access-hchjx\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.684322 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" 
podStartSLOduration=37.615652154 podStartE2EDuration="1m7.684300384s" podCreationTimestamp="2026-03-14 09:16:32 +0000 UTC" firstStartedPulling="2026-03-14 09:16:34.752338347 +0000 UTC m=+1200.265030615" lastFinishedPulling="2026-03-14 09:17:04.820986577 +0000 UTC m=+1230.333678845" observedRunningTime="2026-03-14 09:17:39.676334616 +0000 UTC m=+1265.189026894" watchObservedRunningTime="2026-03-14 09:17:39.684300384 +0000 UTC m=+1265.196992652" Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.724645 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-f89ks" Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.886095 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x876w\" (UniqueName: \"kubernetes.io/projected/d8dd1e13-b758-484a-9310-0fbd88fdd7ca-kube-api-access-x876w\") pod \"d8dd1e13-b758-484a-9310-0fbd88fdd7ca\" (UID: \"d8dd1e13-b758-484a-9310-0fbd88fdd7ca\") " Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.886346 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8dd1e13-b758-484a-9310-0fbd88fdd7ca-operator-scripts\") pod \"d8dd1e13-b758-484a-9310-0fbd88fdd7ca\" (UID: \"d8dd1e13-b758-484a-9310-0fbd88fdd7ca\") " Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.886733 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8dd1e13-b758-484a-9310-0fbd88fdd7ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8dd1e13-b758-484a-9310-0fbd88fdd7ca" (UID: "d8dd1e13-b758-484a-9310-0fbd88fdd7ca"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.887016 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8dd1e13-b758-484a-9310-0fbd88fdd7ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.902433 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8dd1e13-b758-484a-9310-0fbd88fdd7ca-kube-api-access-x876w" (OuterVolumeSpecName: "kube-api-access-x876w") pod "d8dd1e13-b758-484a-9310-0fbd88fdd7ca" (UID: "d8dd1e13-b758-484a-9310-0fbd88fdd7ca"). InnerVolumeSpecName "kube-api-access-x876w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:17:39 crc kubenswrapper[4956]: I0314 09:17:39.988692 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x876w\" (UniqueName: \"kubernetes.io/projected/d8dd1e13-b758-484a-9310-0fbd88fdd7ca-kube-api-access-x876w\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:40 crc kubenswrapper[4956]: I0314 09:17:40.628792 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-f89ks" Mar 14 09:17:40 crc kubenswrapper[4956]: I0314 09:17:40.628790 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-f89ks" event={"ID":"d8dd1e13-b758-484a-9310-0fbd88fdd7ca","Type":"ContainerDied","Data":"c63ba60dc7c00360becf41a898f0aa2e3d5f82accc57821a22d7e2842a8a2eb3"} Mar 14 09:17:40 crc kubenswrapper[4956]: I0314 09:17:40.628951 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c63ba60dc7c00360becf41a898f0aa2e3d5f82accc57821a22d7e2842a8a2eb3" Mar 14 09:17:40 crc kubenswrapper[4956]: I0314 09:17:40.630729 4956 generic.go:334] "Generic (PLEG): container finished" podID="51fad570-b371-4486-9a35-bea145391f8f" containerID="55457f5dd1bb3325f5ece3a416d5409ace52570fed00065af8819aed61915695" exitCode=0 Mar 14 09:17:40 crc kubenswrapper[4956]: I0314 09:17:40.630827 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"51fad570-b371-4486-9a35-bea145391f8f","Type":"ContainerDied","Data":"55457f5dd1bb3325f5ece3a416d5409ace52570fed00065af8819aed61915695"} Mar 14 09:17:41 crc kubenswrapper[4956]: I0314 09:17:41.642421 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"51fad570-b371-4486-9a35-bea145391f8f","Type":"ContainerStarted","Data":"e63963d2c9b63a2cd1fd34ba7767837c6b3c516ead4fbc2621f9d65f11f2b282"} Mar 14 09:17:43 crc kubenswrapper[4956]: I0314 09:17:43.662575 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"51fad570-b371-4486-9a35-bea145391f8f","Type":"ContainerStarted","Data":"299ef98bc1618dd3700def83d7927ab4e5d35331f363fe38d4149578fbd72257"} Mar 14 09:17:44 crc kubenswrapper[4956]: I0314 09:17:44.674024 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"51fad570-b371-4486-9a35-bea145391f8f","Type":"ContainerStarted","Data":"7ca0e87db93062cb1f4dfa4f693ba8893954de03bc6d9db8734e7a7cfcbfaab8"} Mar 14 09:17:44 crc kubenswrapper[4956]: I0314 09:17:44.702526 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/prometheus-metric-storage-0" podStartSLOduration=17.702506837 podStartE2EDuration="17.702506837s" podCreationTimestamp="2026-03-14 09:17:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:17:44.696786484 +0000 UTC m=+1270.209478752" watchObservedRunningTime="2026-03-14 09:17:44.702506837 +0000 UTC m=+1270.215199105" Mar 14 09:17:47 crc kubenswrapper[4956]: I0314 09:17:47.862576 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:17:53 crc kubenswrapper[4956]: I0314 09:17:53.667711 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/rabbitmq-server-0" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.244765 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-db-sync-fqjr8"] Mar 14 09:17:54 crc kubenswrapper[4956]: E0314 09:17:54.245515 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8dd1e13-b758-484a-9310-0fbd88fdd7ca" containerName="mariadb-database-create" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.245532 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8dd1e13-b758-484a-9310-0fbd88fdd7ca" containerName="mariadb-database-create" Mar 14 09:17:54 crc kubenswrapper[4956]: E0314 09:17:54.245549 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c370ae95-6bb5-4a51-88ef-d0ce6811faa3" containerName="mariadb-account-create-update" Mar 14 09:17:54 crc 
kubenswrapper[4956]: I0314 09:17:54.245555 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c370ae95-6bb5-4a51-88ef-d0ce6811faa3" containerName="mariadb-account-create-update" Mar 14 09:17:54 crc kubenswrapper[4956]: E0314 09:17:54.245582 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d63538-b0d7-4e51-9b51-8c3855488802" containerName="mariadb-account-create-update" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.245590 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d63538-b0d7-4e51-9b51-8c3855488802" containerName="mariadb-account-create-update" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.245777 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d63538-b0d7-4e51-9b51-8c3855488802" containerName="mariadb-account-create-update" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.245798 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8dd1e13-b758-484a-9310-0fbd88fdd7ca" containerName="mariadb-database-create" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.245808 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="c370ae95-6bb5-4a51-88ef-d0ce6811faa3" containerName="mariadb-account-create-update" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.246360 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.246518 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-fqjr8" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.249678 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-4pg4m" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.249848 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.249883 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.253968 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.273547 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-fqjr8"] Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.321957 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac0e60a-75b8-410d-b536-8c91aac1873a-config-data\") pod \"keystone-db-sync-fqjr8\" (UID: \"dac0e60a-75b8-410d-b536-8c91aac1873a\") " pod="watcher-kuttl-default/keystone-db-sync-fqjr8" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.322113 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac0e60a-75b8-410d-b536-8c91aac1873a-combined-ca-bundle\") pod \"keystone-db-sync-fqjr8\" (UID: \"dac0e60a-75b8-410d-b536-8c91aac1873a\") " pod="watcher-kuttl-default/keystone-db-sync-fqjr8" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.322195 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlmg9\" (UniqueName: 
\"kubernetes.io/projected/dac0e60a-75b8-410d-b536-8c91aac1873a-kube-api-access-dlmg9\") pod \"keystone-db-sync-fqjr8\" (UID: \"dac0e60a-75b8-410d-b536-8c91aac1873a\") " pod="watcher-kuttl-default/keystone-db-sync-fqjr8" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.423998 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlmg9\" (UniqueName: \"kubernetes.io/projected/dac0e60a-75b8-410d-b536-8c91aac1873a-kube-api-access-dlmg9\") pod \"keystone-db-sync-fqjr8\" (UID: \"dac0e60a-75b8-410d-b536-8c91aac1873a\") " pod="watcher-kuttl-default/keystone-db-sync-fqjr8" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.424109 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac0e60a-75b8-410d-b536-8c91aac1873a-config-data\") pod \"keystone-db-sync-fqjr8\" (UID: \"dac0e60a-75b8-410d-b536-8c91aac1873a\") " pod="watcher-kuttl-default/keystone-db-sync-fqjr8" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.424195 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac0e60a-75b8-410d-b536-8c91aac1873a-combined-ca-bundle\") pod \"keystone-db-sync-fqjr8\" (UID: \"dac0e60a-75b8-410d-b536-8c91aac1873a\") " pod="watcher-kuttl-default/keystone-db-sync-fqjr8" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.430546 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac0e60a-75b8-410d-b536-8c91aac1873a-combined-ca-bundle\") pod \"keystone-db-sync-fqjr8\" (UID: \"dac0e60a-75b8-410d-b536-8c91aac1873a\") " pod="watcher-kuttl-default/keystone-db-sync-fqjr8" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.436612 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dac0e60a-75b8-410d-b536-8c91aac1873a-config-data\") pod \"keystone-db-sync-fqjr8\" (UID: \"dac0e60a-75b8-410d-b536-8c91aac1873a\") " pod="watcher-kuttl-default/keystone-db-sync-fqjr8" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.446213 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlmg9\" (UniqueName: \"kubernetes.io/projected/dac0e60a-75b8-410d-b536-8c91aac1873a-kube-api-access-dlmg9\") pod \"keystone-db-sync-fqjr8\" (UID: \"dac0e60a-75b8-410d-b536-8c91aac1873a\") " pod="watcher-kuttl-default/keystone-db-sync-fqjr8" Mar 14 09:17:54 crc kubenswrapper[4956]: I0314 09:17:54.567507 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-fqjr8" Mar 14 09:17:55 crc kubenswrapper[4956]: I0314 09:17:55.125698 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-fqjr8"] Mar 14 09:17:55 crc kubenswrapper[4956]: W0314 09:17:55.139396 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddac0e60a_75b8_410d_b536_8c91aac1873a.slice/crio-61479e6428aad5d53b5906ac08b5de78ac2b0cdd4f9ff386ea440880096ac2f3 WatchSource:0}: Error finding container 61479e6428aad5d53b5906ac08b5de78ac2b0cdd4f9ff386ea440880096ac2f3: Status 404 returned error can't find the container with id 61479e6428aad5d53b5906ac08b5de78ac2b0cdd4f9ff386ea440880096ac2f3 Mar 14 09:17:55 crc kubenswrapper[4956]: I0314 09:17:55.798211 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-fqjr8" event={"ID":"dac0e60a-75b8-410d-b536-8c91aac1873a","Type":"ContainerStarted","Data":"61479e6428aad5d53b5906ac08b5de78ac2b0cdd4f9ff386ea440880096ac2f3"} Mar 14 09:17:57 crc kubenswrapper[4956]: I0314 09:17:57.863206 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:17:57 crc kubenswrapper[4956]: I0314 09:17:57.872544 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:17:58 crc kubenswrapper[4956]: I0314 09:17:58.834275 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/prometheus-metric-storage-0" Mar 14 09:18:00 crc kubenswrapper[4956]: I0314 09:18:00.135362 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557998-xqk25"] Mar 14 09:18:00 crc kubenswrapper[4956]: I0314 09:18:00.136704 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557998-xqk25" Mar 14 09:18:00 crc kubenswrapper[4956]: I0314 09:18:00.139354 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:18:00 crc kubenswrapper[4956]: I0314 09:18:00.140352 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:18:00 crc kubenswrapper[4956]: I0314 09:18:00.141113 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:18:00 crc kubenswrapper[4956]: I0314 09:18:00.150307 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557998-xqk25"] Mar 14 09:18:00 crc kubenswrapper[4956]: I0314 09:18:00.229926 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kz9h\" (UniqueName: \"kubernetes.io/projected/847bf01f-0dfa-424b-ae2f-8dba3e277a5c-kube-api-access-7kz9h\") pod \"auto-csr-approver-29557998-xqk25\" (UID: \"847bf01f-0dfa-424b-ae2f-8dba3e277a5c\") " pod="openshift-infra/auto-csr-approver-29557998-xqk25" Mar 14 09:18:00 crc kubenswrapper[4956]: I0314 
09:18:00.331563 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kz9h\" (UniqueName: \"kubernetes.io/projected/847bf01f-0dfa-424b-ae2f-8dba3e277a5c-kube-api-access-7kz9h\") pod \"auto-csr-approver-29557998-xqk25\" (UID: \"847bf01f-0dfa-424b-ae2f-8dba3e277a5c\") " pod="openshift-infra/auto-csr-approver-29557998-xqk25" Mar 14 09:18:00 crc kubenswrapper[4956]: I0314 09:18:00.353968 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kz9h\" (UniqueName: \"kubernetes.io/projected/847bf01f-0dfa-424b-ae2f-8dba3e277a5c-kube-api-access-7kz9h\") pod \"auto-csr-approver-29557998-xqk25\" (UID: \"847bf01f-0dfa-424b-ae2f-8dba3e277a5c\") " pod="openshift-infra/auto-csr-approver-29557998-xqk25" Mar 14 09:18:00 crc kubenswrapper[4956]: I0314 09:18:00.462341 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557998-xqk25" Mar 14 09:18:03 crc kubenswrapper[4956]: I0314 09:18:03.420669 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557998-xqk25"] Mar 14 09:18:03 crc kubenswrapper[4956]: I0314 09:18:03.870288 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557998-xqk25" event={"ID":"847bf01f-0dfa-424b-ae2f-8dba3e277a5c","Type":"ContainerStarted","Data":"680ed45e6f717754f507da48d7861b7e234393e8dc6233ab9130868f952caedd"} Mar 14 09:18:03 crc kubenswrapper[4956]: I0314 09:18:03.871955 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-fqjr8" event={"ID":"dac0e60a-75b8-410d-b536-8c91aac1873a","Type":"ContainerStarted","Data":"d7ee5b91e6edffce07e8bbffe9903192a3da277e243b2ad852e691ac9ad013bf"} Mar 14 09:18:03 crc kubenswrapper[4956]: I0314 09:18:03.890382 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-db-sync-fqjr8" 
podStartSLOduration=1.781208328 podStartE2EDuration="9.890366045s" podCreationTimestamp="2026-03-14 09:17:54 +0000 UTC" firstStartedPulling="2026-03-14 09:17:55.141317328 +0000 UTC m=+1280.654009586" lastFinishedPulling="2026-03-14 09:18:03.250475035 +0000 UTC m=+1288.763167303" observedRunningTime="2026-03-14 09:18:03.888599411 +0000 UTC m=+1289.401291679" watchObservedRunningTime="2026-03-14 09:18:03.890366045 +0000 UTC m=+1289.403058313" Mar 14 09:18:04 crc kubenswrapper[4956]: I0314 09:18:04.885001 4956 generic.go:334] "Generic (PLEG): container finished" podID="847bf01f-0dfa-424b-ae2f-8dba3e277a5c" containerID="b5e7ad7c58aa5b7bc28a30c20af15a6749969875cb4bf9ad9a0ad6f7e4d12340" exitCode=0 Mar 14 09:18:04 crc kubenswrapper[4956]: I0314 09:18:04.885077 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557998-xqk25" event={"ID":"847bf01f-0dfa-424b-ae2f-8dba3e277a5c","Type":"ContainerDied","Data":"b5e7ad7c58aa5b7bc28a30c20af15a6749969875cb4bf9ad9a0ad6f7e4d12340"} Mar 14 09:18:06 crc kubenswrapper[4956]: I0314 09:18:06.276418 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557998-xqk25" Mar 14 09:18:06 crc kubenswrapper[4956]: I0314 09:18:06.337094 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kz9h\" (UniqueName: \"kubernetes.io/projected/847bf01f-0dfa-424b-ae2f-8dba3e277a5c-kube-api-access-7kz9h\") pod \"847bf01f-0dfa-424b-ae2f-8dba3e277a5c\" (UID: \"847bf01f-0dfa-424b-ae2f-8dba3e277a5c\") " Mar 14 09:18:06 crc kubenswrapper[4956]: I0314 09:18:06.341630 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/847bf01f-0dfa-424b-ae2f-8dba3e277a5c-kube-api-access-7kz9h" (OuterVolumeSpecName: "kube-api-access-7kz9h") pod "847bf01f-0dfa-424b-ae2f-8dba3e277a5c" (UID: "847bf01f-0dfa-424b-ae2f-8dba3e277a5c"). InnerVolumeSpecName "kube-api-access-7kz9h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:18:06 crc kubenswrapper[4956]: I0314 09:18:06.438377 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kz9h\" (UniqueName: \"kubernetes.io/projected/847bf01f-0dfa-424b-ae2f-8dba3e277a5c-kube-api-access-7kz9h\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:06 crc kubenswrapper[4956]: I0314 09:18:06.907189 4956 generic.go:334] "Generic (PLEG): container finished" podID="dac0e60a-75b8-410d-b536-8c91aac1873a" containerID="d7ee5b91e6edffce07e8bbffe9903192a3da277e243b2ad852e691ac9ad013bf" exitCode=0 Mar 14 09:18:06 crc kubenswrapper[4956]: I0314 09:18:06.907283 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-fqjr8" event={"ID":"dac0e60a-75b8-410d-b536-8c91aac1873a","Type":"ContainerDied","Data":"d7ee5b91e6edffce07e8bbffe9903192a3da277e243b2ad852e691ac9ad013bf"} Mar 14 09:18:06 crc kubenswrapper[4956]: I0314 09:18:06.909399 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557998-xqk25" event={"ID":"847bf01f-0dfa-424b-ae2f-8dba3e277a5c","Type":"ContainerDied","Data":"680ed45e6f717754f507da48d7861b7e234393e8dc6233ab9130868f952caedd"} Mar 14 09:18:06 crc kubenswrapper[4956]: I0314 09:18:06.909435 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557998-xqk25" Mar 14 09:18:06 crc kubenswrapper[4956]: I0314 09:18:06.909444 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="680ed45e6f717754f507da48d7861b7e234393e8dc6233ab9130868f952caedd" Mar 14 09:18:07 crc kubenswrapper[4956]: I0314 09:18:07.344114 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557992-vcc4s"] Mar 14 09:18:07 crc kubenswrapper[4956]: I0314 09:18:07.352076 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557992-vcc4s"] Mar 14 09:18:08 crc kubenswrapper[4956]: I0314 09:18:08.190350 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-fqjr8" Mar 14 09:18:08 crc kubenswrapper[4956]: I0314 09:18:08.266387 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac0e60a-75b8-410d-b536-8c91aac1873a-combined-ca-bundle\") pod \"dac0e60a-75b8-410d-b536-8c91aac1873a\" (UID: \"dac0e60a-75b8-410d-b536-8c91aac1873a\") " Mar 14 09:18:08 crc kubenswrapper[4956]: I0314 09:18:08.266562 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlmg9\" (UniqueName: \"kubernetes.io/projected/dac0e60a-75b8-410d-b536-8c91aac1873a-kube-api-access-dlmg9\") pod \"dac0e60a-75b8-410d-b536-8c91aac1873a\" (UID: \"dac0e60a-75b8-410d-b536-8c91aac1873a\") " Mar 14 09:18:08 crc kubenswrapper[4956]: I0314 09:18:08.266599 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac0e60a-75b8-410d-b536-8c91aac1873a-config-data\") pod \"dac0e60a-75b8-410d-b536-8c91aac1873a\" (UID: \"dac0e60a-75b8-410d-b536-8c91aac1873a\") " Mar 14 09:18:08 crc kubenswrapper[4956]: I0314 09:18:08.272925 4956 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dac0e60a-75b8-410d-b536-8c91aac1873a-kube-api-access-dlmg9" (OuterVolumeSpecName: "kube-api-access-dlmg9") pod "dac0e60a-75b8-410d-b536-8c91aac1873a" (UID: "dac0e60a-75b8-410d-b536-8c91aac1873a"). InnerVolumeSpecName "kube-api-access-dlmg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:18:08 crc kubenswrapper[4956]: I0314 09:18:08.290413 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dac0e60a-75b8-410d-b536-8c91aac1873a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dac0e60a-75b8-410d-b536-8c91aac1873a" (UID: "dac0e60a-75b8-410d-b536-8c91aac1873a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:08 crc kubenswrapper[4956]: I0314 09:18:08.306635 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dac0e60a-75b8-410d-b536-8c91aac1873a-config-data" (OuterVolumeSpecName: "config-data") pod "dac0e60a-75b8-410d-b536-8c91aac1873a" (UID: "dac0e60a-75b8-410d-b536-8c91aac1873a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:08 crc kubenswrapper[4956]: I0314 09:18:08.368012 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac0e60a-75b8-410d-b536-8c91aac1873a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:08 crc kubenswrapper[4956]: I0314 09:18:08.368066 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlmg9\" (UniqueName: \"kubernetes.io/projected/dac0e60a-75b8-410d-b536-8c91aac1873a-kube-api-access-dlmg9\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:08 crc kubenswrapper[4956]: I0314 09:18:08.368083 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac0e60a-75b8-410d-b536-8c91aac1873a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:08 crc kubenswrapper[4956]: I0314 09:18:08.939964 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-fqjr8" event={"ID":"dac0e60a-75b8-410d-b536-8c91aac1873a","Type":"ContainerDied","Data":"61479e6428aad5d53b5906ac08b5de78ac2b0cdd4f9ff386ea440880096ac2f3"} Mar 14 09:18:08 crc kubenswrapper[4956]: I0314 09:18:08.940276 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61479e6428aad5d53b5906ac08b5de78ac2b0cdd4f9ff386ea440880096ac2f3" Mar 14 09:18:08 crc kubenswrapper[4956]: I0314 09:18:08.940064 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-fqjr8" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.140776 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-w2xvg"] Mar 14 09:18:09 crc kubenswrapper[4956]: E0314 09:18:09.141089 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="847bf01f-0dfa-424b-ae2f-8dba3e277a5c" containerName="oc" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.141109 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="847bf01f-0dfa-424b-ae2f-8dba3e277a5c" containerName="oc" Mar 14 09:18:09 crc kubenswrapper[4956]: E0314 09:18:09.141134 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac0e60a-75b8-410d-b536-8c91aac1873a" containerName="keystone-db-sync" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.141142 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac0e60a-75b8-410d-b536-8c91aac1873a" containerName="keystone-db-sync" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.141289 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="dac0e60a-75b8-410d-b536-8c91aac1873a" containerName="keystone-db-sync" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.141304 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="847bf01f-0dfa-424b-ae2f-8dba3e277a5c" containerName="oc" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.141849 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.146228 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"osp-secret" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.146806 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.146948 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.147413 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-4pg4m" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.147970 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.167780 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-w2xvg"] Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.228711 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b3704b7-7a24-482e-b277-c4312f42a29d" path="/var/lib/kubelet/pods/9b3704b7-7a24-482e-b277-c4312f42a29d/volumes" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.282246 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-combined-ca-bundle\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.282752 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-config-data\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.282824 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-fernet-keys\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.282888 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-scripts\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.283078 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-credential-keys\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.283199 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq2kj\" (UniqueName: \"kubernetes.io/projected/abeb6e18-c927-4a7f-9f12-e129f39d7559-kube-api-access-mq2kj\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.289169 4956 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.290980 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.295862 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.296377 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.306871 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.384937 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-combined-ca-bundle\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.385012 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-config-data\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.385049 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-fernet-keys\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.385157 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-scripts\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.385202 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-credential-keys\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.385239 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq2kj\" (UniqueName: \"kubernetes.io/projected/abeb6e18-c927-4a7f-9f12-e129f39d7559-kube-api-access-mq2kj\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.390609 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-fernet-keys\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.392388 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-config-data\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.392429 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-combined-ca-bundle\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.401827 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-scripts\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.404724 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-credential-keys\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.407280 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq2kj\" (UniqueName: \"kubernetes.io/projected/abeb6e18-c927-4a7f-9f12-e129f39d7559-kube-api-access-mq2kj\") pod \"keystone-bootstrap-w2xvg\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.459058 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.487802 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-config-data\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.487896 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e051689-f994-449c-9cf1-76d7f6b2c78b-run-httpd\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.487927 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd5kt\" (UniqueName: \"kubernetes.io/projected/0e051689-f994-449c-9cf1-76d7f6b2c78b-kube-api-access-jd5kt\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.487957 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e051689-f994-449c-9cf1-76d7f6b2c78b-log-httpd\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.487977 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " 
pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.488012 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-scripts\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.488048 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.588920 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-config-data\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.589389 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e051689-f994-449c-9cf1-76d7f6b2c78b-run-httpd\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.589420 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd5kt\" (UniqueName: \"kubernetes.io/projected/0e051689-f994-449c-9cf1-76d7f6b2c78b-kube-api-access-jd5kt\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.589451 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e051689-f994-449c-9cf1-76d7f6b2c78b-log-httpd\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.589474 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.589530 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-scripts\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.589567 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.591213 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e051689-f994-449c-9cf1-76d7f6b2c78b-log-httpd\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.591511 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e051689-f994-449c-9cf1-76d7f6b2c78b-run-httpd\") pod \"ceilometer-0\" 
(UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.595163 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.596027 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.599508 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-scripts\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.601153 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-config-data\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.622520 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd5kt\" (UniqueName: \"kubernetes.io/projected/0e051689-f994-449c-9cf1-76d7f6b2c78b-kube-api-access-jd5kt\") pod \"ceilometer-0\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.911940 4956 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.934443 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-w2xvg"] Mar 14 09:18:09 crc kubenswrapper[4956]: W0314 09:18:09.939035 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabeb6e18_c927_4a7f_9f12_e129f39d7559.slice/crio-1d8a47e4a9e283d652f3399738afac3dd971566e4a33f68dce3ae287ab68e2ee WatchSource:0}: Error finding container 1d8a47e4a9e283d652f3399738afac3dd971566e4a33f68dce3ae287ab68e2ee: Status 404 returned error can't find the container with id 1d8a47e4a9e283d652f3399738afac3dd971566e4a33f68dce3ae287ab68e2ee Mar 14 09:18:09 crc kubenswrapper[4956]: I0314 09:18:09.949623 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" event={"ID":"abeb6e18-c927-4a7f-9f12-e129f39d7559","Type":"ContainerStarted","Data":"1d8a47e4a9e283d652f3399738afac3dd971566e4a33f68dce3ae287ab68e2ee"} Mar 14 09:18:10 crc kubenswrapper[4956]: I0314 09:18:10.425777 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:18:10 crc kubenswrapper[4956]: I0314 09:18:10.973316 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" event={"ID":"abeb6e18-c927-4a7f-9f12-e129f39d7559","Type":"ContainerStarted","Data":"37994017928c556ac062cb21c6afdf9728f5a0725895374937786528ae908194"} Mar 14 09:18:10 crc kubenswrapper[4956]: I0314 09:18:10.974844 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"0e051689-f994-449c-9cf1-76d7f6b2c78b","Type":"ContainerStarted","Data":"8915743bb15cda60b80e944e11b09ee07f76be2cc215d47ff213ddd90f6a404c"} Mar 14 09:18:10 crc kubenswrapper[4956]: I0314 09:18:10.993733 4956 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" podStartSLOduration=1.993711284 podStartE2EDuration="1.993711284s" podCreationTimestamp="2026-03-14 09:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:18:10.990603726 +0000 UTC m=+1296.503295994" watchObservedRunningTime="2026-03-14 09:18:10.993711284 +0000 UTC m=+1296.506403552" Mar 14 09:18:11 crc kubenswrapper[4956]: I0314 09:18:11.783498 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:18:17 crc kubenswrapper[4956]: I0314 09:18:17.035180 4956 generic.go:334] "Generic (PLEG): container finished" podID="abeb6e18-c927-4a7f-9f12-e129f39d7559" containerID="37994017928c556ac062cb21c6afdf9728f5a0725895374937786528ae908194" exitCode=0 Mar 14 09:18:17 crc kubenswrapper[4956]: I0314 09:18:17.035268 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" event={"ID":"abeb6e18-c927-4a7f-9f12-e129f39d7559","Type":"ContainerDied","Data":"37994017928c556ac062cb21c6afdf9728f5a0725895374937786528ae908194"} Mar 14 09:18:17 crc kubenswrapper[4956]: I0314 09:18:17.037870 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"0e051689-f994-449c-9cf1-76d7f6b2c78b","Type":"ContainerStarted","Data":"29b979e03d61371fb898d3ada829b3023ac4c6dc560e785e9dfb5b8b3b1985ea"} Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.744865 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.867735 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq2kj\" (UniqueName: \"kubernetes.io/projected/abeb6e18-c927-4a7f-9f12-e129f39d7559-kube-api-access-mq2kj\") pod \"abeb6e18-c927-4a7f-9f12-e129f39d7559\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.867775 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-combined-ca-bundle\") pod \"abeb6e18-c927-4a7f-9f12-e129f39d7559\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.867824 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-scripts\") pod \"abeb6e18-c927-4a7f-9f12-e129f39d7559\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.867924 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-fernet-keys\") pod \"abeb6e18-c927-4a7f-9f12-e129f39d7559\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.867975 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-credential-keys\") pod \"abeb6e18-c927-4a7f-9f12-e129f39d7559\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.868026 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-config-data\") pod \"abeb6e18-c927-4a7f-9f12-e129f39d7559\" (UID: \"abeb6e18-c927-4a7f-9f12-e129f39d7559\") " Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.887301 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "abeb6e18-c927-4a7f-9f12-e129f39d7559" (UID: "abeb6e18-c927-4a7f-9f12-e129f39d7559"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.887401 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abeb6e18-c927-4a7f-9f12-e129f39d7559-kube-api-access-mq2kj" (OuterVolumeSpecName: "kube-api-access-mq2kj") pod "abeb6e18-c927-4a7f-9f12-e129f39d7559" (UID: "abeb6e18-c927-4a7f-9f12-e129f39d7559"). InnerVolumeSpecName "kube-api-access-mq2kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.888514 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-scripts" (OuterVolumeSpecName: "scripts") pod "abeb6e18-c927-4a7f-9f12-e129f39d7559" (UID: "abeb6e18-c927-4a7f-9f12-e129f39d7559"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.888816 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "abeb6e18-c927-4a7f-9f12-e129f39d7559" (UID: "abeb6e18-c927-4a7f-9f12-e129f39d7559"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.892546 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abeb6e18-c927-4a7f-9f12-e129f39d7559" (UID: "abeb6e18-c927-4a7f-9f12-e129f39d7559"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.904732 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-config-data" (OuterVolumeSpecName: "config-data") pod "abeb6e18-c927-4a7f-9f12-e129f39d7559" (UID: "abeb6e18-c927-4a7f-9f12-e129f39d7559"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.969652 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.969689 4956 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.969700 4956 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.969711 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:19 crc 
kubenswrapper[4956]: I0314 09:18:19.969724 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq2kj\" (UniqueName: \"kubernetes.io/projected/abeb6e18-c927-4a7f-9f12-e129f39d7559-kube-api-access-mq2kj\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:19 crc kubenswrapper[4956]: I0314 09:18:19.969736 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abeb6e18-c927-4a7f-9f12-e129f39d7559-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:20 crc kubenswrapper[4956]: I0314 09:18:20.066040 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" event={"ID":"abeb6e18-c927-4a7f-9f12-e129f39d7559","Type":"ContainerDied","Data":"1d8a47e4a9e283d652f3399738afac3dd971566e4a33f68dce3ae287ab68e2ee"} Mar 14 09:18:20 crc kubenswrapper[4956]: I0314 09:18:20.066081 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d8a47e4a9e283d652f3399738afac3dd971566e4a33f68dce3ae287ab68e2ee" Mar 14 09:18:20 crc kubenswrapper[4956]: I0314 09:18:20.066141 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-w2xvg" Mar 14 09:18:20 crc kubenswrapper[4956]: I0314 09:18:20.935187 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-w2xvg"] Mar 14 09:18:20 crc kubenswrapper[4956]: I0314 09:18:20.942937 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-w2xvg"] Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.026922 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-4gqxj"] Mar 14 09:18:21 crc kubenswrapper[4956]: E0314 09:18:21.027338 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abeb6e18-c927-4a7f-9f12-e129f39d7559" containerName="keystone-bootstrap" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.027364 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="abeb6e18-c927-4a7f-9f12-e129f39d7559" containerName="keystone-bootstrap" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.027587 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="abeb6e18-c927-4a7f-9f12-e129f39d7559" containerName="keystone-bootstrap" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.028834 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.034825 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"osp-secret" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.034905 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.035005 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.035014 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-4pg4m" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.035022 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.039222 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-4gqxj"] Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.076172 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"0e051689-f994-449c-9cf1-76d7f6b2c78b","Type":"ContainerStarted","Data":"367e29cfa382a66aed956f633ba0f378f310f161a24aef2c0bcaf03195520234"} Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.190754 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-fernet-keys\") pod \"keystone-bootstrap-4gqxj\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.190819 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-credential-keys\") pod \"keystone-bootstrap-4gqxj\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.190844 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t46g6\" (UniqueName: \"kubernetes.io/projected/f4212e38-b90e-4dc4-b431-c8b943624ffd-kube-api-access-t46g6\") pod \"keystone-bootstrap-4gqxj\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.190923 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-config-data\") pod \"keystone-bootstrap-4gqxj\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.190947 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-scripts\") pod \"keystone-bootstrap-4gqxj\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.190965 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-combined-ca-bundle\") pod \"keystone-bootstrap-4gqxj\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.218315 4956 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="abeb6e18-c927-4a7f-9f12-e129f39d7559" path="/var/lib/kubelet/pods/abeb6e18-c927-4a7f-9f12-e129f39d7559/volumes" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.292761 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-config-data\") pod \"keystone-bootstrap-4gqxj\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.293109 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-scripts\") pod \"keystone-bootstrap-4gqxj\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.293193 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-combined-ca-bundle\") pod \"keystone-bootstrap-4gqxj\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.293302 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-fernet-keys\") pod \"keystone-bootstrap-4gqxj\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.293389 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-credential-keys\") pod \"keystone-bootstrap-4gqxj\" (UID: 
\"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.293452 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t46g6\" (UniqueName: \"kubernetes.io/projected/f4212e38-b90e-4dc4-b431-c8b943624ffd-kube-api-access-t46g6\") pod \"keystone-bootstrap-4gqxj\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.298144 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-scripts\") pod \"keystone-bootstrap-4gqxj\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.298383 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-credential-keys\") pod \"keystone-bootstrap-4gqxj\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.299212 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-config-data\") pod \"keystone-bootstrap-4gqxj\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.299616 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-fernet-keys\") pod \"keystone-bootstrap-4gqxj\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " 
pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.308813 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-combined-ca-bundle\") pod \"keystone-bootstrap-4gqxj\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.315999 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t46g6\" (UniqueName: \"kubernetes.io/projected/f4212e38-b90e-4dc4-b431-c8b943624ffd-kube-api-access-t46g6\") pod \"keystone-bootstrap-4gqxj\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.365311 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:21 crc kubenswrapper[4956]: I0314 09:18:21.796386 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-4gqxj"] Mar 14 09:18:21 crc kubenswrapper[4956]: W0314 09:18:21.803505 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4212e38_b90e_4dc4_b431_c8b943624ffd.slice/crio-f434df6b2ab4be41f94e7b407e61975e03a315000580bdc30a2c782276ab9ee1 WatchSource:0}: Error finding container f434df6b2ab4be41f94e7b407e61975e03a315000580bdc30a2c782276ab9ee1: Status 404 returned error can't find the container with id f434df6b2ab4be41f94e7b407e61975e03a315000580bdc30a2c782276ab9ee1 Mar 14 09:18:22 crc kubenswrapper[4956]: I0314 09:18:22.089762 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" 
event={"ID":"f4212e38-b90e-4dc4-b431-c8b943624ffd","Type":"ContainerStarted","Data":"f434df6b2ab4be41f94e7b407e61975e03a315000580bdc30a2c782276ab9ee1"} Mar 14 09:18:23 crc kubenswrapper[4956]: I0314 09:18:23.098867 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" event={"ID":"f4212e38-b90e-4dc4-b431-c8b943624ffd","Type":"ContainerStarted","Data":"6b8c0650e8787f9a9f1dfcb7fb9f7f30bfe5c4005e10c6d8c6b74845f8cb733d"} Mar 14 09:18:24 crc kubenswrapper[4956]: I0314 09:18:24.126993 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" podStartSLOduration=3.126974749 podStartE2EDuration="3.126974749s" podCreationTimestamp="2026-03-14 09:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:18:24.120842136 +0000 UTC m=+1309.633534414" watchObservedRunningTime="2026-03-14 09:18:24.126974749 +0000 UTC m=+1309.639667017" Mar 14 09:18:26 crc kubenswrapper[4956]: I0314 09:18:26.129573 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"0e051689-f994-449c-9cf1-76d7f6b2c78b","Type":"ContainerStarted","Data":"90a304efcd57ab395d3441c805c78d779eef236333973c8105196d4e155a10f6"} Mar 14 09:18:27 crc kubenswrapper[4956]: I0314 09:18:27.140693 4956 generic.go:334] "Generic (PLEG): container finished" podID="f4212e38-b90e-4dc4-b431-c8b943624ffd" containerID="6b8c0650e8787f9a9f1dfcb7fb9f7f30bfe5c4005e10c6d8c6b74845f8cb733d" exitCode=0 Mar 14 09:18:27 crc kubenswrapper[4956]: I0314 09:18:27.140739 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" event={"ID":"f4212e38-b90e-4dc4-b431-c8b943624ffd","Type":"ContainerDied","Data":"6b8c0650e8787f9a9f1dfcb7fb9f7f30bfe5c4005e10c6d8c6b74845f8cb733d"} Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 
09:18:28.451609 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 09:18:28.531396 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-fernet-keys\") pod \"f4212e38-b90e-4dc4-b431-c8b943624ffd\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 09:18:28.531497 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-credential-keys\") pod \"f4212e38-b90e-4dc4-b431-c8b943624ffd\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 09:18:28.531578 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-combined-ca-bundle\") pod \"f4212e38-b90e-4dc4-b431-c8b943624ffd\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 09:18:28.531613 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-scripts\") pod \"f4212e38-b90e-4dc4-b431-c8b943624ffd\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 09:18:28.531647 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t46g6\" (UniqueName: \"kubernetes.io/projected/f4212e38-b90e-4dc4-b431-c8b943624ffd-kube-api-access-t46g6\") pod \"f4212e38-b90e-4dc4-b431-c8b943624ffd\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 09:18:28.531673 4956 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-config-data\") pod \"f4212e38-b90e-4dc4-b431-c8b943624ffd\" (UID: \"f4212e38-b90e-4dc4-b431-c8b943624ffd\") " Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 09:18:28.537285 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f4212e38-b90e-4dc4-b431-c8b943624ffd" (UID: "f4212e38-b90e-4dc4-b431-c8b943624ffd"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 09:18:28.537452 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-scripts" (OuterVolumeSpecName: "scripts") pod "f4212e38-b90e-4dc4-b431-c8b943624ffd" (UID: "f4212e38-b90e-4dc4-b431-c8b943624ffd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 09:18:28.537888 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f4212e38-b90e-4dc4-b431-c8b943624ffd" (UID: "f4212e38-b90e-4dc4-b431-c8b943624ffd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 09:18:28.538269 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4212e38-b90e-4dc4-b431-c8b943624ffd-kube-api-access-t46g6" (OuterVolumeSpecName: "kube-api-access-t46g6") pod "f4212e38-b90e-4dc4-b431-c8b943624ffd" (UID: "f4212e38-b90e-4dc4-b431-c8b943624ffd"). InnerVolumeSpecName "kube-api-access-t46g6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 09:18:28.554621 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-config-data" (OuterVolumeSpecName: "config-data") pod "f4212e38-b90e-4dc4-b431-c8b943624ffd" (UID: "f4212e38-b90e-4dc4-b431-c8b943624ffd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 09:18:28.557538 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4212e38-b90e-4dc4-b431-c8b943624ffd" (UID: "f4212e38-b90e-4dc4-b431-c8b943624ffd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 09:18:28.633212 4956 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 09:18:28.633245 4956 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 09:18:28.633257 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 09:18:28.633266 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:28 
crc kubenswrapper[4956]: I0314 09:18:28.633274 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t46g6\" (UniqueName: \"kubernetes.io/projected/f4212e38-b90e-4dc4-b431-c8b943624ffd-kube-api-access-t46g6\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:28 crc kubenswrapper[4956]: I0314 09:18:28.633284 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4212e38-b90e-4dc4-b431-c8b943624ffd-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.161155 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" event={"ID":"f4212e38-b90e-4dc4-b431-c8b943624ffd","Type":"ContainerDied","Data":"f434df6b2ab4be41f94e7b407e61975e03a315000580bdc30a2c782276ab9ee1"} Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.161443 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f434df6b2ab4be41f94e7b407e61975e03a315000580bdc30a2c782276ab9ee1" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.161221 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-4gqxj" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.282751 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-6b8d58878f-dcbq5"] Mar 14 09:18:29 crc kubenswrapper[4956]: E0314 09:18:29.283131 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4212e38-b90e-4dc4-b431-c8b943624ffd" containerName="keystone-bootstrap" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.283148 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4212e38-b90e-4dc4-b431-c8b943624ffd" containerName="keystone-bootstrap" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.283290 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4212e38-b90e-4dc4-b431-c8b943624ffd" containerName="keystone-bootstrap" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.283903 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.293670 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.293896 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-keystone-internal-svc" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.294178 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.295805 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.308888 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-4pg4m" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 
09:18:29.309227 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-keystone-public-svc" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.312701 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-6b8d58878f-dcbq5"] Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.448107 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zzxq\" (UniqueName: \"kubernetes.io/projected/47b1a93a-9d42-4390-a824-7a86cd4f9be0-kube-api-access-5zzxq\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.448161 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-public-tls-certs\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.448188 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-combined-ca-bundle\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.448375 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-internal-tls-certs\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 
09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.448468 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-scripts\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.448600 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-config-data\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.448676 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-fernet-keys\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.448713 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-credential-keys\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.550365 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zzxq\" (UniqueName: \"kubernetes.io/projected/47b1a93a-9d42-4390-a824-7a86cd4f9be0-kube-api-access-5zzxq\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " 
pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.550417 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-public-tls-certs\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.550435 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-combined-ca-bundle\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.550476 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-internal-tls-certs\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.550535 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-scripts\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.550569 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-config-data\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " 
pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.550605 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-fernet-keys\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.550622 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-credential-keys\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.555345 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-combined-ca-bundle\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.555630 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-fernet-keys\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.555671 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-credential-keys\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 
09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.556218 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-config-data\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.556435 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-internal-tls-certs\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.557589 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-public-tls-certs\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.560880 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-scripts\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.572172 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zzxq\" (UniqueName: \"kubernetes.io/projected/47b1a93a-9d42-4390-a824-7a86cd4f9be0-kube-api-access-5zzxq\") pod \"keystone-6b8d58878f-dcbq5\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:29 crc kubenswrapper[4956]: I0314 09:18:29.610254 4956 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:33 crc kubenswrapper[4956]: I0314 09:18:33.449434 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-6b8d58878f-dcbq5"] Mar 14 09:18:33 crc kubenswrapper[4956]: W0314 09:18:33.451691 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47b1a93a_9d42_4390_a824_7a86cd4f9be0.slice/crio-0cf9f0458d91112c9274f0f0fbbc7fb3ee5fbed0b4eed91c137aa075f7f27cf7 WatchSource:0}: Error finding container 0cf9f0458d91112c9274f0f0fbbc7fb3ee5fbed0b4eed91c137aa075f7f27cf7: Status 404 returned error can't find the container with id 0cf9f0458d91112c9274f0f0fbbc7fb3ee5fbed0b4eed91c137aa075f7f27cf7 Mar 14 09:18:34 crc kubenswrapper[4956]: I0314 09:18:34.221940 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" event={"ID":"47b1a93a-9d42-4390-a824-7a86cd4f9be0","Type":"ContainerStarted","Data":"55cbaf0e1b37a8595c5ef6237c4da433e2d1f7de027ebbc5a3c44805950cff57"} Mar 14 09:18:34 crc kubenswrapper[4956]: I0314 09:18:34.222011 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" event={"ID":"47b1a93a-9d42-4390-a824-7a86cd4f9be0","Type":"ContainerStarted","Data":"0cf9f0458d91112c9274f0f0fbbc7fb3ee5fbed0b4eed91c137aa075f7f27cf7"} Mar 14 09:18:34 crc kubenswrapper[4956]: I0314 09:18:34.222050 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:18:34 crc kubenswrapper[4956]: I0314 09:18:34.226047 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"0e051689-f994-449c-9cf1-76d7f6b2c78b","Type":"ContainerStarted","Data":"bef78f3775781a7d0ef2ab2974d2ba68186160d5c94fa57b5f7966b2fe8c6716"} Mar 14 09:18:34 
crc kubenswrapper[4956]: I0314 09:18:34.226310 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerName="ceilometer-central-agent" containerID="cri-o://29b979e03d61371fb898d3ada829b3023ac4c6dc560e785e9dfb5b8b3b1985ea" gracePeriod=30 Mar 14 09:18:34 crc kubenswrapper[4956]: I0314 09:18:34.226497 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerName="sg-core" containerID="cri-o://90a304efcd57ab395d3441c805c78d779eef236333973c8105196d4e155a10f6" gracePeriod=30 Mar 14 09:18:34 crc kubenswrapper[4956]: I0314 09:18:34.226525 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerName="proxy-httpd" containerID="cri-o://bef78f3775781a7d0ef2ab2974d2ba68186160d5c94fa57b5f7966b2fe8c6716" gracePeriod=30 Mar 14 09:18:34 crc kubenswrapper[4956]: I0314 09:18:34.226533 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerName="ceilometer-notification-agent" containerID="cri-o://367e29cfa382a66aed956f633ba0f378f310f161a24aef2c0bcaf03195520234" gracePeriod=30 Mar 14 09:18:34 crc kubenswrapper[4956]: I0314 09:18:34.226508 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:34 crc kubenswrapper[4956]: I0314 09:18:34.259424 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" podStartSLOduration=5.25939275 podStartE2EDuration="5.25939275s" podCreationTimestamp="2026-03-14 09:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:18:34.249578975 +0000 UTC m=+1319.762271243" watchObservedRunningTime="2026-03-14 09:18:34.25939275 +0000 UTC m=+1319.772085018" Mar 14 09:18:34 crc kubenswrapper[4956]: I0314 09:18:34.283411 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.593075756 podStartE2EDuration="25.283380188s" podCreationTimestamp="2026-03-14 09:18:09 +0000 UTC" firstStartedPulling="2026-03-14 09:18:10.446957143 +0000 UTC m=+1295.959649411" lastFinishedPulling="2026-03-14 09:18:33.137261575 +0000 UTC m=+1318.649953843" observedRunningTime="2026-03-14 09:18:34.268940348 +0000 UTC m=+1319.781632626" watchObservedRunningTime="2026-03-14 09:18:34.283380188 +0000 UTC m=+1319.796072476" Mar 14 09:18:35 crc kubenswrapper[4956]: I0314 09:18:35.237361 4956 generic.go:334] "Generic (PLEG): container finished" podID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerID="bef78f3775781a7d0ef2ab2974d2ba68186160d5c94fa57b5f7966b2fe8c6716" exitCode=0 Mar 14 09:18:35 crc kubenswrapper[4956]: I0314 09:18:35.237726 4956 generic.go:334] "Generic (PLEG): container finished" podID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerID="90a304efcd57ab395d3441c805c78d779eef236333973c8105196d4e155a10f6" exitCode=2 Mar 14 09:18:35 crc kubenswrapper[4956]: I0314 09:18:35.237740 4956 generic.go:334] "Generic (PLEG): container finished" podID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerID="29b979e03d61371fb898d3ada829b3023ac4c6dc560e785e9dfb5b8b3b1985ea" exitCode=0 Mar 14 09:18:35 crc kubenswrapper[4956]: I0314 09:18:35.237949 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"0e051689-f994-449c-9cf1-76d7f6b2c78b","Type":"ContainerDied","Data":"bef78f3775781a7d0ef2ab2974d2ba68186160d5c94fa57b5f7966b2fe8c6716"} Mar 14 09:18:35 crc kubenswrapper[4956]: I0314 09:18:35.237990 4956 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"0e051689-f994-449c-9cf1-76d7f6b2c78b","Type":"ContainerDied","Data":"90a304efcd57ab395d3441c805c78d779eef236333973c8105196d4e155a10f6"} Mar 14 09:18:35 crc kubenswrapper[4956]: I0314 09:18:35.238003 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"0e051689-f994-449c-9cf1-76d7f6b2c78b","Type":"ContainerDied","Data":"29b979e03d61371fb898d3ada829b3023ac4c6dc560e785e9dfb5b8b3b1985ea"} Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.647617 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.765720 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-scripts\") pod \"0e051689-f994-449c-9cf1-76d7f6b2c78b\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.765777 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-sg-core-conf-yaml\") pod \"0e051689-f994-449c-9cf1-76d7f6b2c78b\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.765824 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-combined-ca-bundle\") pod \"0e051689-f994-449c-9cf1-76d7f6b2c78b\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.765858 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e051689-f994-449c-9cf1-76d7f6b2c78b-run-httpd\") pod 
\"0e051689-f994-449c-9cf1-76d7f6b2c78b\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.765876 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-config-data\") pod \"0e051689-f994-449c-9cf1-76d7f6b2c78b\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.765906 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd5kt\" (UniqueName: \"kubernetes.io/projected/0e051689-f994-449c-9cf1-76d7f6b2c78b-kube-api-access-jd5kt\") pod \"0e051689-f994-449c-9cf1-76d7f6b2c78b\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.765946 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e051689-f994-449c-9cf1-76d7f6b2c78b-log-httpd\") pod \"0e051689-f994-449c-9cf1-76d7f6b2c78b\" (UID: \"0e051689-f994-449c-9cf1-76d7f6b2c78b\") " Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.766659 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e051689-f994-449c-9cf1-76d7f6b2c78b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0e051689-f994-449c-9cf1-76d7f6b2c78b" (UID: "0e051689-f994-449c-9cf1-76d7f6b2c78b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.767198 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e051689-f994-449c-9cf1-76d7f6b2c78b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0e051689-f994-449c-9cf1-76d7f6b2c78b" (UID: "0e051689-f994-449c-9cf1-76d7f6b2c78b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.771345 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-scripts" (OuterVolumeSpecName: "scripts") pod "0e051689-f994-449c-9cf1-76d7f6b2c78b" (UID: "0e051689-f994-449c-9cf1-76d7f6b2c78b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.771383 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e051689-f994-449c-9cf1-76d7f6b2c78b-kube-api-access-jd5kt" (OuterVolumeSpecName: "kube-api-access-jd5kt") pod "0e051689-f994-449c-9cf1-76d7f6b2c78b" (UID: "0e051689-f994-449c-9cf1-76d7f6b2c78b"). InnerVolumeSpecName "kube-api-access-jd5kt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.788820 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0e051689-f994-449c-9cf1-76d7f6b2c78b" (UID: "0e051689-f994-449c-9cf1-76d7f6b2c78b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.832931 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e051689-f994-449c-9cf1-76d7f6b2c78b" (UID: "0e051689-f994-449c-9cf1-76d7f6b2c78b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.844351 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-config-data" (OuterVolumeSpecName: "config-data") pod "0e051689-f994-449c-9cf1-76d7f6b2c78b" (UID: "0e051689-f994-449c-9cf1-76d7f6b2c78b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.868019 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.868058 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.868095 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.868107 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e051689-f994-449c-9cf1-76d7f6b2c78b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.868119 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e051689-f994-449c-9cf1-76d7f6b2c78b-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.868129 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd5kt\" (UniqueName: 
\"kubernetes.io/projected/0e051689-f994-449c-9cf1-76d7f6b2c78b-kube-api-access-jd5kt\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:36 crc kubenswrapper[4956]: I0314 09:18:36.868139 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e051689-f994-449c-9cf1-76d7f6b2c78b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.263760 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.263813 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"0e051689-f994-449c-9cf1-76d7f6b2c78b","Type":"ContainerDied","Data":"367e29cfa382a66aed956f633ba0f378f310f161a24aef2c0bcaf03195520234"} Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.263701 4956 generic.go:334] "Generic (PLEG): container finished" podID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerID="367e29cfa382a66aed956f633ba0f378f310f161a24aef2c0bcaf03195520234" exitCode=0 Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.264662 4956 scope.go:117] "RemoveContainer" containerID="bef78f3775781a7d0ef2ab2974d2ba68186160d5c94fa57b5f7966b2fe8c6716" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.264682 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"0e051689-f994-449c-9cf1-76d7f6b2c78b","Type":"ContainerDied","Data":"8915743bb15cda60b80e944e11b09ee07f76be2cc215d47ff213ddd90f6a404c"} Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.287227 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.290321 4956 scope.go:117] "RemoveContainer" containerID="90a304efcd57ab395d3441c805c78d779eef236333973c8105196d4e155a10f6" Mar 14 09:18:37 crc kubenswrapper[4956]: 
I0314 09:18:37.294942 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.313205 4956 scope.go:117] "RemoveContainer" containerID="367e29cfa382a66aed956f633ba0f378f310f161a24aef2c0bcaf03195520234" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.313426 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:18:37 crc kubenswrapper[4956]: E0314 09:18:37.313792 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerName="ceilometer-notification-agent" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.313809 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerName="ceilometer-notification-agent" Mar 14 09:18:37 crc kubenswrapper[4956]: E0314 09:18:37.313833 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerName="ceilometer-central-agent" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.313842 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerName="ceilometer-central-agent" Mar 14 09:18:37 crc kubenswrapper[4956]: E0314 09:18:37.313856 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerName="sg-core" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.313864 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerName="sg-core" Mar 14 09:18:37 crc kubenswrapper[4956]: E0314 09:18:37.313878 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerName="proxy-httpd" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.313885 4956 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerName="proxy-httpd" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.314033 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerName="proxy-httpd" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.314050 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerName="ceilometer-central-agent" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.314061 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerName="sg-core" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.314071 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e051689-f994-449c-9cf1-76d7f6b2c78b" containerName="ceilometer-notification-agent" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.315463 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.322819 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.322856 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.338920 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.341765 4956 scope.go:117] "RemoveContainer" containerID="29b979e03d61371fb898d3ada829b3023ac4c6dc560e785e9dfb5b8b3b1985ea" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.376654 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.376717 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-log-httpd\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.376741 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.376923 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkbdz\" (UniqueName: \"kubernetes.io/projected/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-kube-api-access-gkbdz\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.376982 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-config-data\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.377076 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-scripts\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.377193 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-run-httpd\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.385088 4956 scope.go:117] "RemoveContainer" containerID="bef78f3775781a7d0ef2ab2974d2ba68186160d5c94fa57b5f7966b2fe8c6716" Mar 14 09:18:37 crc kubenswrapper[4956]: E0314 09:18:37.385518 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bef78f3775781a7d0ef2ab2974d2ba68186160d5c94fa57b5f7966b2fe8c6716\": container with ID starting with bef78f3775781a7d0ef2ab2974d2ba68186160d5c94fa57b5f7966b2fe8c6716 
not found: ID does not exist" containerID="bef78f3775781a7d0ef2ab2974d2ba68186160d5c94fa57b5f7966b2fe8c6716" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.385559 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bef78f3775781a7d0ef2ab2974d2ba68186160d5c94fa57b5f7966b2fe8c6716"} err="failed to get container status \"bef78f3775781a7d0ef2ab2974d2ba68186160d5c94fa57b5f7966b2fe8c6716\": rpc error: code = NotFound desc = could not find container \"bef78f3775781a7d0ef2ab2974d2ba68186160d5c94fa57b5f7966b2fe8c6716\": container with ID starting with bef78f3775781a7d0ef2ab2974d2ba68186160d5c94fa57b5f7966b2fe8c6716 not found: ID does not exist" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.385580 4956 scope.go:117] "RemoveContainer" containerID="90a304efcd57ab395d3441c805c78d779eef236333973c8105196d4e155a10f6" Mar 14 09:18:37 crc kubenswrapper[4956]: E0314 09:18:37.386846 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90a304efcd57ab395d3441c805c78d779eef236333973c8105196d4e155a10f6\": container with ID starting with 90a304efcd57ab395d3441c805c78d779eef236333973c8105196d4e155a10f6 not found: ID does not exist" containerID="90a304efcd57ab395d3441c805c78d779eef236333973c8105196d4e155a10f6" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.386906 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90a304efcd57ab395d3441c805c78d779eef236333973c8105196d4e155a10f6"} err="failed to get container status \"90a304efcd57ab395d3441c805c78d779eef236333973c8105196d4e155a10f6\": rpc error: code = NotFound desc = could not find container \"90a304efcd57ab395d3441c805c78d779eef236333973c8105196d4e155a10f6\": container with ID starting with 90a304efcd57ab395d3441c805c78d779eef236333973c8105196d4e155a10f6 not found: ID does not exist" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 
09:18:37.386940 4956 scope.go:117] "RemoveContainer" containerID="367e29cfa382a66aed956f633ba0f378f310f161a24aef2c0bcaf03195520234" Mar 14 09:18:37 crc kubenswrapper[4956]: E0314 09:18:37.387393 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"367e29cfa382a66aed956f633ba0f378f310f161a24aef2c0bcaf03195520234\": container with ID starting with 367e29cfa382a66aed956f633ba0f378f310f161a24aef2c0bcaf03195520234 not found: ID does not exist" containerID="367e29cfa382a66aed956f633ba0f378f310f161a24aef2c0bcaf03195520234" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.387419 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"367e29cfa382a66aed956f633ba0f378f310f161a24aef2c0bcaf03195520234"} err="failed to get container status \"367e29cfa382a66aed956f633ba0f378f310f161a24aef2c0bcaf03195520234\": rpc error: code = NotFound desc = could not find container \"367e29cfa382a66aed956f633ba0f378f310f161a24aef2c0bcaf03195520234\": container with ID starting with 367e29cfa382a66aed956f633ba0f378f310f161a24aef2c0bcaf03195520234 not found: ID does not exist" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.387437 4956 scope.go:117] "RemoveContainer" containerID="29b979e03d61371fb898d3ada829b3023ac4c6dc560e785e9dfb5b8b3b1985ea" Mar 14 09:18:37 crc kubenswrapper[4956]: E0314 09:18:37.387772 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29b979e03d61371fb898d3ada829b3023ac4c6dc560e785e9dfb5b8b3b1985ea\": container with ID starting with 29b979e03d61371fb898d3ada829b3023ac4c6dc560e785e9dfb5b8b3b1985ea not found: ID does not exist" containerID="29b979e03d61371fb898d3ada829b3023ac4c6dc560e785e9dfb5b8b3b1985ea" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.387811 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"29b979e03d61371fb898d3ada829b3023ac4c6dc560e785e9dfb5b8b3b1985ea"} err="failed to get container status \"29b979e03d61371fb898d3ada829b3023ac4c6dc560e785e9dfb5b8b3b1985ea\": rpc error: code = NotFound desc = could not find container \"29b979e03d61371fb898d3ada829b3023ac4c6dc560e785e9dfb5b8b3b1985ea\": container with ID starting with 29b979e03d61371fb898d3ada829b3023ac4c6dc560e785e9dfb5b8b3b1985ea not found: ID does not exist" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.405633 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:18:37 crc kubenswrapper[4956]: E0314 09:18:37.406157 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-gkbdz log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="watcher-kuttl-default/ceilometer-0" podUID="8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.478352 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkbdz\" (UniqueName: \"kubernetes.io/projected/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-kube-api-access-gkbdz\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.478407 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-config-data\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.478448 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-scripts\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.478509 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-run-httpd\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.478526 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.478559 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.478575 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-log-httpd\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.479068 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-log-httpd\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" 
Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.479302 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-run-httpd\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.485359 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.485435 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.485587 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-scripts\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.485806 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-config-data\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:37 crc kubenswrapper[4956]: I0314 09:18:37.495441 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkbdz\" (UniqueName: 
\"kubernetes.io/projected/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-kube-api-access-gkbdz\") pod \"ceilometer-0\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.273234 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.286456 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.392510 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-config-data\") pod \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.392575 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-scripts\") pod \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.392644 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-sg-core-conf-yaml\") pod \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.392687 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-run-httpd\") pod \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 
09:18:38.392732 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkbdz\" (UniqueName: \"kubernetes.io/projected/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-kube-api-access-gkbdz\") pod \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.392766 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-log-httpd\") pod \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.392791 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-combined-ca-bundle\") pod \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\" (UID: \"8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a\") " Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.393162 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a" (UID: "8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.393671 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a" (UID: "8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.402343 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a" (UID: "8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.403752 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-kube-api-access-gkbdz" (OuterVolumeSpecName: "kube-api-access-gkbdz") pod "8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a" (UID: "8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a"). InnerVolumeSpecName "kube-api-access-gkbdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.404619 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-config-data" (OuterVolumeSpecName: "config-data") pod "8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a" (UID: "8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.406736 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a" (UID: "8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.408625 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-scripts" (OuterVolumeSpecName: "scripts") pod "8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a" (UID: "8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.750674 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.750718 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkbdz\" (UniqueName: \"kubernetes.io/projected/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-kube-api-access-gkbdz\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.750735 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.750746 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.750759 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.750771 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:38 crc kubenswrapper[4956]: I0314 09:18:38.750781 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.222155 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e051689-f994-449c-9cf1-76d7f6b2c78b" path="/var/lib/kubelet/pods/0e051689-f994-449c-9cf1-76d7f6b2c78b/volumes" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.281295 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.321910 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.328016 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.348142 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.353315 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.356666 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.366861 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.367296 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.367962 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40d2917d-4096-4317-97e7-d84af851611b-run-httpd\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.367996 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-scripts\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.368025 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.368132 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-config-data\") pod 
\"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.368179 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.368226 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dvfh\" (UniqueName: \"kubernetes.io/projected/40d2917d-4096-4317-97e7-d84af851611b-kube-api-access-7dvfh\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.368375 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40d2917d-4096-4317-97e7-d84af851611b-log-httpd\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.470397 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.470455 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-config-data\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " 
pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.470509 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.470545 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dvfh\" (UniqueName: \"kubernetes.io/projected/40d2917d-4096-4317-97e7-d84af851611b-kube-api-access-7dvfh\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.470625 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40d2917d-4096-4317-97e7-d84af851611b-log-httpd\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.470682 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40d2917d-4096-4317-97e7-d84af851611b-run-httpd\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.470703 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-scripts\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.471320 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40d2917d-4096-4317-97e7-d84af851611b-log-httpd\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.471530 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40d2917d-4096-4317-97e7-d84af851611b-run-httpd\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.474617 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.475569 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.476194 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-scripts\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.476661 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-config-data\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" 
Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.489794 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dvfh\" (UniqueName: \"kubernetes.io/projected/40d2917d-4096-4317-97e7-d84af851611b-kube-api-access-7dvfh\") pod \"ceilometer-0\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:39 crc kubenswrapper[4956]: I0314 09:18:39.684177 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:40 crc kubenswrapper[4956]: I0314 09:18:40.159984 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:18:40 crc kubenswrapper[4956]: I0314 09:18:40.289724 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"40d2917d-4096-4317-97e7-d84af851611b","Type":"ContainerStarted","Data":"77c618a3a45337513e6b03364b54376e099f2ac0cddbff6eea95ddcbea04af6f"} Mar 14 09:18:41 crc kubenswrapper[4956]: I0314 09:18:41.225617 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a" path="/var/lib/kubelet/pods/8204ccdf-38dc-4ee5-9c91-26ea7d64fe4a/volumes" Mar 14 09:18:43 crc kubenswrapper[4956]: I0314 09:18:43.317914 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"40d2917d-4096-4317-97e7-d84af851611b","Type":"ContainerStarted","Data":"3d1168fc8e9e4b1c7ea3d3bc83955f284332032c76c85a315d0819c4da4be9bb"} Mar 14 09:18:44 crc kubenswrapper[4956]: I0314 09:18:44.329619 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"40d2917d-4096-4317-97e7-d84af851611b","Type":"ContainerStarted","Data":"232e6e2911d15f7c7960793153def3aa21571998f38cf099d85f431a0a395ea5"} Mar 14 09:18:45 crc kubenswrapper[4956]: I0314 09:18:45.341623 4956 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"40d2917d-4096-4317-97e7-d84af851611b","Type":"ContainerStarted","Data":"f94b02cb9c8c0680f7daf758a665e816c6eeb8ebcc9b75819a677c89efba03c6"} Mar 14 09:18:46 crc kubenswrapper[4956]: I0314 09:18:46.351041 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"40d2917d-4096-4317-97e7-d84af851611b","Type":"ContainerStarted","Data":"67d6db4175e8bc3e75c5641d2915c06a5138a6c976f945695f1bbd728970f6bb"} Mar 14 09:18:46 crc kubenswrapper[4956]: I0314 09:18:46.352374 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:18:46 crc kubenswrapper[4956]: I0314 09:18:46.376125 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.717642138 podStartE2EDuration="7.376110194s" podCreationTimestamp="2026-03-14 09:18:39 +0000 UTC" firstStartedPulling="2026-03-14 09:18:40.165890883 +0000 UTC m=+1325.678583151" lastFinishedPulling="2026-03-14 09:18:45.824358909 +0000 UTC m=+1331.337051207" observedRunningTime="2026-03-14 09:18:46.372667398 +0000 UTC m=+1331.885359666" watchObservedRunningTime="2026-03-14 09:18:46.376110194 +0000 UTC m=+1331.888802462" Mar 14 09:18:53 crc kubenswrapper[4956]: I0314 09:18:53.961751 4956 scope.go:117] "RemoveContainer" containerID="9635f7225b64ac8dfdd57c9a5fb12f2e4c04e99bf58daba2f2b4bbdfa0826df0" Mar 14 09:18:55 crc kubenswrapper[4956]: I0314 09:18:55.423466 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:18:55 crc kubenswrapper[4956]: I0314 09:18:55.423822 4956 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:19:01 crc kubenswrapper[4956]: I0314 09:19:01.312244 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.559332 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/openstackclient"] Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.560911 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.563052 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-config" Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.564724 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"openstackclient-openstackclient-dockercfg-k8htp" Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.571953 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"openstack-config-secret" Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.577161 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstackclient"] Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.709833 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/354ebd6e-ffb3-4a8e-9725-424c863614ea-openstack-config\") pod \"openstackclient\" (UID: \"354ebd6e-ffb3-4a8e-9725-424c863614ea\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.709896 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/354ebd6e-ffb3-4a8e-9725-424c863614ea-openstack-config-secret\") pod \"openstackclient\" (UID: \"354ebd6e-ffb3-4a8e-9725-424c863614ea\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.709953 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354ebd6e-ffb3-4a8e-9725-424c863614ea-combined-ca-bundle\") pod \"openstackclient\" (UID: \"354ebd6e-ffb3-4a8e-9725-424c863614ea\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.710004 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bp9w\" (UniqueName: \"kubernetes.io/projected/354ebd6e-ffb3-4a8e-9725-424c863614ea-kube-api-access-4bp9w\") pod \"openstackclient\" (UID: \"354ebd6e-ffb3-4a8e-9725-424c863614ea\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.811616 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354ebd6e-ffb3-4a8e-9725-424c863614ea-combined-ca-bundle\") pod \"openstackclient\" (UID: \"354ebd6e-ffb3-4a8e-9725-424c863614ea\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.812050 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bp9w\" (UniqueName: \"kubernetes.io/projected/354ebd6e-ffb3-4a8e-9725-424c863614ea-kube-api-access-4bp9w\") pod \"openstackclient\" (UID: \"354ebd6e-ffb3-4a8e-9725-424c863614ea\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.812188 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/354ebd6e-ffb3-4a8e-9725-424c863614ea-openstack-config\") pod \"openstackclient\" (UID: \"354ebd6e-ffb3-4a8e-9725-424c863614ea\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.812317 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/354ebd6e-ffb3-4a8e-9725-424c863614ea-openstack-config-secret\") pod \"openstackclient\" (UID: \"354ebd6e-ffb3-4a8e-9725-424c863614ea\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.813344 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/354ebd6e-ffb3-4a8e-9725-424c863614ea-openstack-config\") pod \"openstackclient\" (UID: \"354ebd6e-ffb3-4a8e-9725-424c863614ea\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.820393 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/openstackclient"] Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.821250 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354ebd6e-ffb3-4a8e-9725-424c863614ea-combined-ca-bundle\") pod \"openstackclient\" (UID: \"354ebd6e-ffb3-4a8e-9725-424c863614ea\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:05 crc kubenswrapper[4956]: E0314 09:19:05.821474 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-4bp9w openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="watcher-kuttl-default/openstackclient" podUID="354ebd6e-ffb3-4a8e-9725-424c863614ea" Mar 14 09:19:05 crc 
kubenswrapper[4956]: E0314 09:19:05.826886 4956 projected.go:194] Error preparing data for projected volume kube-api-access-4bp9w for pod watcher-kuttl-default/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "watcher-kuttl-default": no relationship found between node 'crc' and this object Mar 14 09:19:05 crc kubenswrapper[4956]: E0314 09:19:05.826952 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/354ebd6e-ffb3-4a8e-9725-424c863614ea-kube-api-access-4bp9w podName:354ebd6e-ffb3-4a8e-9725-424c863614ea nodeName:}" failed. No retries permitted until 2026-03-14 09:19:06.326933909 +0000 UTC m=+1351.839626177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4bp9w" (UniqueName: "kubernetes.io/projected/354ebd6e-ffb3-4a8e-9725-424c863614ea-kube-api-access-4bp9w") pod "openstackclient" (UID: "354ebd6e-ffb3-4a8e-9725-424c863614ea") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "watcher-kuttl-default": no relationship found between node 'crc' and this object Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.827633 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/354ebd6e-ffb3-4a8e-9725-424c863614ea-openstack-config-secret\") pod \"openstackclient\" (UID: \"354ebd6e-ffb3-4a8e-9725-424c863614ea\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.846265 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/openstackclient"] Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.856821 4956 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["watcher-kuttl-default/openstackclient"] Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.857807 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:05 crc kubenswrapper[4956]: I0314 09:19:05.872010 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstackclient"] Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.018442 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4fbb48a6-d5ed-423b-90e4-809c839c8675-openstack-config-secret\") pod \"openstackclient\" (UID: \"4fbb48a6-d5ed-423b-90e4-809c839c8675\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.018511 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4fbb48a6-d5ed-423b-90e4-809c839c8675-openstack-config\") pod \"openstackclient\" (UID: \"4fbb48a6-d5ed-423b-90e4-809c839c8675\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.018553 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fbb48a6-d5ed-423b-90e4-809c839c8675-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4fbb48a6-d5ed-423b-90e4-809c839c8675\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.018682 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgxg8\" (UniqueName: \"kubernetes.io/projected/4fbb48a6-d5ed-423b-90e4-809c839c8675-kube-api-access-pgxg8\") pod \"openstackclient\" (UID: \"4fbb48a6-d5ed-423b-90e4-809c839c8675\") " 
pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.120556 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4fbb48a6-d5ed-423b-90e4-809c839c8675-openstack-config-secret\") pod \"openstackclient\" (UID: \"4fbb48a6-d5ed-423b-90e4-809c839c8675\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.121238 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4fbb48a6-d5ed-423b-90e4-809c839c8675-openstack-config\") pod \"openstackclient\" (UID: \"4fbb48a6-d5ed-423b-90e4-809c839c8675\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.121283 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fbb48a6-d5ed-423b-90e4-809c839c8675-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4fbb48a6-d5ed-423b-90e4-809c839c8675\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.121375 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgxg8\" (UniqueName: \"kubernetes.io/projected/4fbb48a6-d5ed-423b-90e4-809c839c8675-kube-api-access-pgxg8\") pod \"openstackclient\" (UID: \"4fbb48a6-d5ed-423b-90e4-809c839c8675\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.121715 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4fbb48a6-d5ed-423b-90e4-809c839c8675-openstack-config\") pod \"openstackclient\" (UID: \"4fbb48a6-d5ed-423b-90e4-809c839c8675\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 
09:19:06.126124 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4fbb48a6-d5ed-423b-90e4-809c839c8675-openstack-config-secret\") pod \"openstackclient\" (UID: \"4fbb48a6-d5ed-423b-90e4-809c839c8675\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.126235 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fbb48a6-d5ed-423b-90e4-809c839c8675-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4fbb48a6-d5ed-423b-90e4-809c839c8675\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.141955 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgxg8\" (UniqueName: \"kubernetes.io/projected/4fbb48a6-d5ed-423b-90e4-809c839c8675-kube-api-access-pgxg8\") pod \"openstackclient\" (UID: \"4fbb48a6-d5ed-423b-90e4-809c839c8675\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.231436 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.344765 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bp9w\" (UniqueName: \"kubernetes.io/projected/354ebd6e-ffb3-4a8e-9725-424c863614ea-kube-api-access-4bp9w\") pod \"openstackclient\" (UID: \"354ebd6e-ffb3-4a8e-9725-424c863614ea\") " pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:06 crc kubenswrapper[4956]: E0314 09:19:06.349836 4956 projected.go:194] Error preparing data for projected volume kube-api-access-4bp9w for pod watcher-kuttl-default/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (354ebd6e-ffb3-4a8e-9725-424c863614ea) does not match the UID in record. The object might have been deleted and then recreated Mar 14 09:19:06 crc kubenswrapper[4956]: E0314 09:19:06.349926 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/354ebd6e-ffb3-4a8e-9725-424c863614ea-kube-api-access-4bp9w podName:354ebd6e-ffb3-4a8e-9725-424c863614ea nodeName:}" failed. No retries permitted until 2026-03-14 09:19:07.349902336 +0000 UTC m=+1352.862594654 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-4bp9w" (UniqueName: "kubernetes.io/projected/354ebd6e-ffb3-4a8e-9725-424c863614ea-kube-api-access-4bp9w") pod "openstackclient" (UID: "354ebd6e-ffb3-4a8e-9725-424c863614ea") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (354ebd6e-ffb3-4a8e-9725-424c863614ea) does not match the UID in record. The object might have been deleted and then recreated Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.502881 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.506555 4956 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="watcher-kuttl-default/openstackclient" oldPodUID="354ebd6e-ffb3-4a8e-9725-424c863614ea" podUID="4fbb48a6-d5ed-423b-90e4-809c839c8675" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.516072 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.548071 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354ebd6e-ffb3-4a8e-9725-424c863614ea-combined-ca-bundle\") pod \"354ebd6e-ffb3-4a8e-9725-424c863614ea\" (UID: \"354ebd6e-ffb3-4a8e-9725-424c863614ea\") " Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.548221 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/354ebd6e-ffb3-4a8e-9725-424c863614ea-openstack-config-secret\") pod \"354ebd6e-ffb3-4a8e-9725-424c863614ea\" (UID: \"354ebd6e-ffb3-4a8e-9725-424c863614ea\") " Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.548301 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/354ebd6e-ffb3-4a8e-9725-424c863614ea-openstack-config\") pod \"354ebd6e-ffb3-4a8e-9725-424c863614ea\" (UID: \"354ebd6e-ffb3-4a8e-9725-424c863614ea\") " Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.548544 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bp9w\" (UniqueName: \"kubernetes.io/projected/354ebd6e-ffb3-4a8e-9725-424c863614ea-kube-api-access-4bp9w\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.548829 4956 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/354ebd6e-ffb3-4a8e-9725-424c863614ea-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "354ebd6e-ffb3-4a8e-9725-424c863614ea" (UID: "354ebd6e-ffb3-4a8e-9725-424c863614ea"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.551455 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/354ebd6e-ffb3-4a8e-9725-424c863614ea-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "354ebd6e-ffb3-4a8e-9725-424c863614ea" (UID: "354ebd6e-ffb3-4a8e-9725-424c863614ea"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.551797 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/354ebd6e-ffb3-4a8e-9725-424c863614ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "354ebd6e-ffb3-4a8e-9725-424c863614ea" (UID: "354ebd6e-ffb3-4a8e-9725-424c863614ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.649541 4956 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/354ebd6e-ffb3-4a8e-9725-424c863614ea-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.649575 4956 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/354ebd6e-ffb3-4a8e-9725-424c863614ea-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.649585 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354ebd6e-ffb3-4a8e-9725-424c863614ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:06 crc kubenswrapper[4956]: I0314 09:19:06.710399 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstackclient"] Mar 14 09:19:07 crc kubenswrapper[4956]: I0314 09:19:07.219896 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="354ebd6e-ffb3-4a8e-9725-424c863614ea" path="/var/lib/kubelet/pods/354ebd6e-ffb3-4a8e-9725-424c863614ea/volumes" Mar 14 09:19:07 crc kubenswrapper[4956]: I0314 09:19:07.510109 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstackclient" Mar 14 09:19:07 crc kubenswrapper[4956]: I0314 09:19:07.510101 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstackclient" event={"ID":"4fbb48a6-d5ed-423b-90e4-809c839c8675","Type":"ContainerStarted","Data":"b3c21eb6150394d5793e31dde7fc7f0cbdd49ea36e246081fe33b4bc8818acce"} Mar 14 09:19:07 crc kubenswrapper[4956]: I0314 09:19:07.518532 4956 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="watcher-kuttl-default/openstackclient" oldPodUID="354ebd6e-ffb3-4a8e-9725-424c863614ea" podUID="4fbb48a6-d5ed-423b-90e4-809c839c8675" Mar 14 09:19:09 crc kubenswrapper[4956]: I0314 09:19:09.688845 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:12 crc kubenswrapper[4956]: I0314 09:19:12.096846 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Mar 14 09:19:12 crc kubenswrapper[4956]: I0314 09:19:12.099622 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/kube-state-metrics-0" podUID="dcc01311-198b-436e-b438-5229352baf03" containerName="kube-state-metrics" containerID="cri-o://8e987a2758a809eda8aab5c2e246171d528b2215024d0907ad0e630edb027c5a" gracePeriod=30 Mar 14 09:19:12 crc kubenswrapper[4956]: I0314 09:19:12.553936 4956 generic.go:334] "Generic (PLEG): container finished" podID="dcc01311-198b-436e-b438-5229352baf03" containerID="8e987a2758a809eda8aab5c2e246171d528b2215024d0907ad0e630edb027c5a" exitCode=2 Mar 14 09:19:12 crc kubenswrapper[4956]: I0314 09:19:12.553988 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"dcc01311-198b-436e-b438-5229352baf03","Type":"ContainerDied","Data":"8e987a2758a809eda8aab5c2e246171d528b2215024d0907ad0e630edb027c5a"} Mar 14 09:19:13 crc 
kubenswrapper[4956]: I0314 09:19:13.085466 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:19:13 crc kubenswrapper[4956]: I0314 09:19:13.086372 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="40d2917d-4096-4317-97e7-d84af851611b" containerName="ceilometer-central-agent" containerID="cri-o://3d1168fc8e9e4b1c7ea3d3bc83955f284332032c76c85a315d0819c4da4be9bb" gracePeriod=30 Mar 14 09:19:13 crc kubenswrapper[4956]: I0314 09:19:13.086502 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="40d2917d-4096-4317-97e7-d84af851611b" containerName="sg-core" containerID="cri-o://f94b02cb9c8c0680f7daf758a665e816c6eeb8ebcc9b75819a677c89efba03c6" gracePeriod=30 Mar 14 09:19:13 crc kubenswrapper[4956]: I0314 09:19:13.086605 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="40d2917d-4096-4317-97e7-d84af851611b" containerName="proxy-httpd" containerID="cri-o://67d6db4175e8bc3e75c5641d2915c06a5138a6c976f945695f1bbd728970f6bb" gracePeriod=30 Mar 14 09:19:13 crc kubenswrapper[4956]: I0314 09:19:13.086695 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="40d2917d-4096-4317-97e7-d84af851611b" containerName="ceilometer-notification-agent" containerID="cri-o://232e6e2911d15f7c7960793153def3aa21571998f38cf099d85f431a0a395ea5" gracePeriod=30 Mar 14 09:19:13 crc kubenswrapper[4956]: I0314 09:19:13.569473 4956 generic.go:334] "Generic (PLEG): container finished" podID="40d2917d-4096-4317-97e7-d84af851611b" containerID="67d6db4175e8bc3e75c5641d2915c06a5138a6c976f945695f1bbd728970f6bb" exitCode=0 Mar 14 09:19:13 crc kubenswrapper[4956]: I0314 09:19:13.569527 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="40d2917d-4096-4317-97e7-d84af851611b" containerID="f94b02cb9c8c0680f7daf758a665e816c6eeb8ebcc9b75819a677c89efba03c6" exitCode=2 Mar 14 09:19:13 crc kubenswrapper[4956]: I0314 09:19:13.569535 4956 generic.go:334] "Generic (PLEG): container finished" podID="40d2917d-4096-4317-97e7-d84af851611b" containerID="3d1168fc8e9e4b1c7ea3d3bc83955f284332032c76c85a315d0819c4da4be9bb" exitCode=0 Mar 14 09:19:13 crc kubenswrapper[4956]: I0314 09:19:13.569546 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"40d2917d-4096-4317-97e7-d84af851611b","Type":"ContainerDied","Data":"67d6db4175e8bc3e75c5641d2915c06a5138a6c976f945695f1bbd728970f6bb"} Mar 14 09:19:13 crc kubenswrapper[4956]: I0314 09:19:13.569654 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"40d2917d-4096-4317-97e7-d84af851611b","Type":"ContainerDied","Data":"f94b02cb9c8c0680f7daf758a665e816c6eeb8ebcc9b75819a677c89efba03c6"} Mar 14 09:19:13 crc kubenswrapper[4956]: I0314 09:19:13.569675 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"40d2917d-4096-4317-97e7-d84af851611b","Type":"ContainerDied","Data":"3d1168fc8e9e4b1c7ea3d3bc83955f284332032c76c85a315d0819c4da4be9bb"} Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.191285 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.235379 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40d2917d-4096-4317-97e7-d84af851611b-log-httpd\") pod \"40d2917d-4096-4317-97e7-d84af851611b\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.235912 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-combined-ca-bundle\") pod \"40d2917d-4096-4317-97e7-d84af851611b\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.236019 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-scripts\") pod \"40d2917d-4096-4317-97e7-d84af851611b\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.236093 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-sg-core-conf-yaml\") pod \"40d2917d-4096-4317-97e7-d84af851611b\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.236293 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40d2917d-4096-4317-97e7-d84af851611b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "40d2917d-4096-4317-97e7-d84af851611b" (UID: "40d2917d-4096-4317-97e7-d84af851611b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.236309 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40d2917d-4096-4317-97e7-d84af851611b-run-httpd\") pod \"40d2917d-4096-4317-97e7-d84af851611b\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.236444 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-config-data\") pod \"40d2917d-4096-4317-97e7-d84af851611b\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.236535 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dvfh\" (UniqueName: \"kubernetes.io/projected/40d2917d-4096-4317-97e7-d84af851611b-kube-api-access-7dvfh\") pod \"40d2917d-4096-4317-97e7-d84af851611b\" (UID: \"40d2917d-4096-4317-97e7-d84af851611b\") " Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.236858 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40d2917d-4096-4317-97e7-d84af851611b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "40d2917d-4096-4317-97e7-d84af851611b" (UID: "40d2917d-4096-4317-97e7-d84af851611b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.237245 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40d2917d-4096-4317-97e7-d84af851611b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.241842 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40d2917d-4096-4317-97e7-d84af851611b-kube-api-access-7dvfh" (OuterVolumeSpecName: "kube-api-access-7dvfh") pod "40d2917d-4096-4317-97e7-d84af851611b" (UID: "40d2917d-4096-4317-97e7-d84af851611b"). InnerVolumeSpecName "kube-api-access-7dvfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.244146 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-scripts" (OuterVolumeSpecName: "scripts") pod "40d2917d-4096-4317-97e7-d84af851611b" (UID: "40d2917d-4096-4317-97e7-d84af851611b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.260707 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "40d2917d-4096-4317-97e7-d84af851611b" (UID: "40d2917d-4096-4317-97e7-d84af851611b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.285346 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.338602 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzzvc\" (UniqueName: \"kubernetes.io/projected/dcc01311-198b-436e-b438-5229352baf03-kube-api-access-bzzvc\") pod \"dcc01311-198b-436e-b438-5229352baf03\" (UID: \"dcc01311-198b-436e-b438-5229352baf03\") " Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.338974 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dvfh\" (UniqueName: \"kubernetes.io/projected/40d2917d-4096-4317-97e7-d84af851611b-kube-api-access-7dvfh\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.338989 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.339000 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.339009 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40d2917d-4096-4317-97e7-d84af851611b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.344342 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcc01311-198b-436e-b438-5229352baf03-kube-api-access-bzzvc" (OuterVolumeSpecName: "kube-api-access-bzzvc") pod "dcc01311-198b-436e-b438-5229352baf03" (UID: "dcc01311-198b-436e-b438-5229352baf03"). InnerVolumeSpecName "kube-api-access-bzzvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.356676 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-config-data" (OuterVolumeSpecName: "config-data") pod "40d2917d-4096-4317-97e7-d84af851611b" (UID: "40d2917d-4096-4317-97e7-d84af851611b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.365046 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40d2917d-4096-4317-97e7-d84af851611b" (UID: "40d2917d-4096-4317-97e7-d84af851611b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.440919 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzzvc\" (UniqueName: \"kubernetes.io/projected/dcc01311-198b-436e-b438-5229352baf03-kube-api-access-bzzvc\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.440968 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.440979 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40d2917d-4096-4317-97e7-d84af851611b-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.591192 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" 
event={"ID":"dcc01311-198b-436e-b438-5229352baf03","Type":"ContainerDied","Data":"63778119c816b07d5a90abcb435cdeb5d31e8d2c61bbd6168ebd669372cb5549"} Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.591278 4956 scope.go:117] "RemoveContainer" containerID="8e987a2758a809eda8aab5c2e246171d528b2215024d0907ad0e630edb027c5a" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.591782 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.594301 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstackclient" event={"ID":"4fbb48a6-d5ed-423b-90e4-809c839c8675","Type":"ContainerStarted","Data":"ec0c07e1f4d1c29f6b3a91730ff6085aa390e2a413bfa130c9eb55c91fdf0ee3"} Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.625562 4956 generic.go:334] "Generic (PLEG): container finished" podID="40d2917d-4096-4317-97e7-d84af851611b" containerID="232e6e2911d15f7c7960793153def3aa21571998f38cf099d85f431a0a395ea5" exitCode=0 Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.626710 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.628046 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"40d2917d-4096-4317-97e7-d84af851611b","Type":"ContainerDied","Data":"232e6e2911d15f7c7960793153def3aa21571998f38cf099d85f431a0a395ea5"} Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.628107 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"40d2917d-4096-4317-97e7-d84af851611b","Type":"ContainerDied","Data":"77c618a3a45337513e6b03364b54376e099f2ac0cddbff6eea95ddcbea04af6f"} Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.641642 4956 scope.go:117] "RemoveContainer" containerID="67d6db4175e8bc3e75c5641d2915c06a5138a6c976f945695f1bbd728970f6bb" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.641840 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/openstackclient" podStartSLOduration=2.392641199 podStartE2EDuration="10.641822157s" podCreationTimestamp="2026-03-14 09:19:05 +0000 UTC" firstStartedPulling="2026-03-14 09:19:06.716955403 +0000 UTC m=+1352.229647681" lastFinishedPulling="2026-03-14 09:19:14.966136381 +0000 UTC m=+1360.478828639" observedRunningTime="2026-03-14 09:19:15.632607937 +0000 UTC m=+1361.145300205" watchObservedRunningTime="2026-03-14 09:19:15.641822157 +0000 UTC m=+1361.154514425" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.663542 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.668315 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.674099 4956 scope.go:117] "RemoveContainer" 
containerID="f94b02cb9c8c0680f7daf758a665e816c6eeb8ebcc9b75819a677c89efba03c6" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.699119 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.705153 4956 scope.go:117] "RemoveContainer" containerID="232e6e2911d15f7c7960793153def3aa21571998f38cf099d85f431a0a395ea5" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.714372 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.731668 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Mar 14 09:19:15 crc kubenswrapper[4956]: E0314 09:19:15.732081 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40d2917d-4096-4317-97e7-d84af851611b" containerName="ceilometer-central-agent" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.732104 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="40d2917d-4096-4317-97e7-d84af851611b" containerName="ceilometer-central-agent" Mar 14 09:19:15 crc kubenswrapper[4956]: E0314 09:19:15.732124 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40d2917d-4096-4317-97e7-d84af851611b" containerName="proxy-httpd" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.732133 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="40d2917d-4096-4317-97e7-d84af851611b" containerName="proxy-httpd" Mar 14 09:19:15 crc kubenswrapper[4956]: E0314 09:19:15.732150 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40d2917d-4096-4317-97e7-d84af851611b" containerName="sg-core" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.732158 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="40d2917d-4096-4317-97e7-d84af851611b" containerName="sg-core" Mar 14 09:19:15 crc kubenswrapper[4956]: E0314 09:19:15.732178 4956 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40d2917d-4096-4317-97e7-d84af851611b" containerName="ceilometer-notification-agent" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.732186 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="40d2917d-4096-4317-97e7-d84af851611b" containerName="ceilometer-notification-agent" Mar 14 09:19:15 crc kubenswrapper[4956]: E0314 09:19:15.732199 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcc01311-198b-436e-b438-5229352baf03" containerName="kube-state-metrics" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.732209 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcc01311-198b-436e-b438-5229352baf03" containerName="kube-state-metrics" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.732392 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="40d2917d-4096-4317-97e7-d84af851611b" containerName="ceilometer-notification-agent" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.732405 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="40d2917d-4096-4317-97e7-d84af851611b" containerName="sg-core" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.732416 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="40d2917d-4096-4317-97e7-d84af851611b" containerName="ceilometer-central-agent" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.732427 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="40d2917d-4096-4317-97e7-d84af851611b" containerName="proxy-httpd" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.732446 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcc01311-198b-436e-b438-5229352baf03" containerName="kube-state-metrics" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.733145 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.739084 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"telemetry-ceilometer-dockercfg-8w9fh" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.739139 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-kube-state-metrics-svc" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.739338 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"kube-state-metrics-tls-config" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.747745 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.754690 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.754729 4956 scope.go:117] "RemoveContainer" containerID="3d1168fc8e9e4b1c7ea3d3bc83955f284332032c76c85a315d0819c4da4be9bb" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.763787 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.775339 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.775543 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.775730 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.782798 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.794742 4956 scope.go:117] "RemoveContainer" containerID="67d6db4175e8bc3e75c5641d2915c06a5138a6c976f945695f1bbd728970f6bb" Mar 14 09:19:15 crc kubenswrapper[4956]: E0314 09:19:15.795364 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67d6db4175e8bc3e75c5641d2915c06a5138a6c976f945695f1bbd728970f6bb\": container with ID starting with 67d6db4175e8bc3e75c5641d2915c06a5138a6c976f945695f1bbd728970f6bb not found: ID does not exist" containerID="67d6db4175e8bc3e75c5641d2915c06a5138a6c976f945695f1bbd728970f6bb" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.795426 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67d6db4175e8bc3e75c5641d2915c06a5138a6c976f945695f1bbd728970f6bb"} err="failed to get container status \"67d6db4175e8bc3e75c5641d2915c06a5138a6c976f945695f1bbd728970f6bb\": rpc error: code = NotFound desc = could not find container \"67d6db4175e8bc3e75c5641d2915c06a5138a6c976f945695f1bbd728970f6bb\": container with ID starting with 67d6db4175e8bc3e75c5641d2915c06a5138a6c976f945695f1bbd728970f6bb not found: ID does not 
exist" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.795462 4956 scope.go:117] "RemoveContainer" containerID="f94b02cb9c8c0680f7daf758a665e816c6eeb8ebcc9b75819a677c89efba03c6" Mar 14 09:19:15 crc kubenswrapper[4956]: E0314 09:19:15.797611 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f94b02cb9c8c0680f7daf758a665e816c6eeb8ebcc9b75819a677c89efba03c6\": container with ID starting with f94b02cb9c8c0680f7daf758a665e816c6eeb8ebcc9b75819a677c89efba03c6 not found: ID does not exist" containerID="f94b02cb9c8c0680f7daf758a665e816c6eeb8ebcc9b75819a677c89efba03c6" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.797647 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f94b02cb9c8c0680f7daf758a665e816c6eeb8ebcc9b75819a677c89efba03c6"} err="failed to get container status \"f94b02cb9c8c0680f7daf758a665e816c6eeb8ebcc9b75819a677c89efba03c6\": rpc error: code = NotFound desc = could not find container \"f94b02cb9c8c0680f7daf758a665e816c6eeb8ebcc9b75819a677c89efba03c6\": container with ID starting with f94b02cb9c8c0680f7daf758a665e816c6eeb8ebcc9b75819a677c89efba03c6 not found: ID does not exist" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.797663 4956 scope.go:117] "RemoveContainer" containerID="232e6e2911d15f7c7960793153def3aa21571998f38cf099d85f431a0a395ea5" Mar 14 09:19:15 crc kubenswrapper[4956]: E0314 09:19:15.798558 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"232e6e2911d15f7c7960793153def3aa21571998f38cf099d85f431a0a395ea5\": container with ID starting with 232e6e2911d15f7c7960793153def3aa21571998f38cf099d85f431a0a395ea5 not found: ID does not exist" containerID="232e6e2911d15f7c7960793153def3aa21571998f38cf099d85f431a0a395ea5" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.798580 4956 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"232e6e2911d15f7c7960793153def3aa21571998f38cf099d85f431a0a395ea5"} err="failed to get container status \"232e6e2911d15f7c7960793153def3aa21571998f38cf099d85f431a0a395ea5\": rpc error: code = NotFound desc = could not find container \"232e6e2911d15f7c7960793153def3aa21571998f38cf099d85f431a0a395ea5\": container with ID starting with 232e6e2911d15f7c7960793153def3aa21571998f38cf099d85f431a0a395ea5 not found: ID does not exist" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.798593 4956 scope.go:117] "RemoveContainer" containerID="3d1168fc8e9e4b1c7ea3d3bc83955f284332032c76c85a315d0819c4da4be9bb" Mar 14 09:19:15 crc kubenswrapper[4956]: E0314 09:19:15.798876 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d1168fc8e9e4b1c7ea3d3bc83955f284332032c76c85a315d0819c4da4be9bb\": container with ID starting with 3d1168fc8e9e4b1c7ea3d3bc83955f284332032c76c85a315d0819c4da4be9bb not found: ID does not exist" containerID="3d1168fc8e9e4b1c7ea3d3bc83955f284332032c76c85a315d0819c4da4be9bb" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.798897 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d1168fc8e9e4b1c7ea3d3bc83955f284332032c76c85a315d0819c4da4be9bb"} err="failed to get container status \"3d1168fc8e9e4b1c7ea3d3bc83955f284332032c76c85a315d0819c4da4be9bb\": rpc error: code = NotFound desc = could not find container \"3d1168fc8e9e4b1c7ea3d3bc83955f284332032c76c85a315d0819c4da4be9bb\": container with ID starting with 3d1168fc8e9e4b1c7ea3d3bc83955f284332032c76c85a315d0819c4da4be9bb not found: ID does not exist" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.850673 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0e00a078-7836-4348-b513-0c8af77d837d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0e00a078-7836-4348-b513-0c8af77d837d\") " pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.850749 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e00a078-7836-4348-b513-0c8af77d837d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0e00a078-7836-4348-b513-0c8af77d837d\") " pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.850792 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z5sm\" (UniqueName: \"kubernetes.io/projected/0e00a078-7836-4348-b513-0c8af77d837d-kube-api-access-6z5sm\") pod \"kube-state-metrics-0\" (UID: \"0e00a078-7836-4348-b513-0c8af77d837d\") " pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.850835 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0e00a078-7836-4348-b513-0c8af77d837d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0e00a078-7836-4348-b513-0c8af77d837d\") " pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.951967 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z5sm\" (UniqueName: \"kubernetes.io/projected/0e00a078-7836-4348-b513-0c8af77d837d-kube-api-access-6z5sm\") pod \"kube-state-metrics-0\" (UID: \"0e00a078-7836-4348-b513-0c8af77d837d\") " pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.952037 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.952090 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-scripts\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.952157 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.952217 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b68eaedb-1389-45e7-b145-f8ad57fb4aca-log-httpd\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.952281 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0e00a078-7836-4348-b513-0c8af77d837d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0e00a078-7836-4348-b513-0c8af77d837d\") " pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.952354 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b68eaedb-1389-45e7-b145-f8ad57fb4aca-run-httpd\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.952413 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e00a078-7836-4348-b513-0c8af77d837d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0e00a078-7836-4348-b513-0c8af77d837d\") " pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.952469 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m949\" (UniqueName: \"kubernetes.io/projected/b68eaedb-1389-45e7-b145-f8ad57fb4aca-kube-api-access-7m949\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.953279 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.953314 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-config-data\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.953363 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0e00a078-7836-4348-b513-0c8af77d837d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0e00a078-7836-4348-b513-0c8af77d837d\") " pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.957416 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e00a078-7836-4348-b513-0c8af77d837d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0e00a078-7836-4348-b513-0c8af77d837d\") " pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.959468 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0e00a078-7836-4348-b513-0c8af77d837d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0e00a078-7836-4348-b513-0c8af77d837d\") " pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.971033 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e00a078-7836-4348-b513-0c8af77d837d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0e00a078-7836-4348-b513-0c8af77d837d\") " pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:15 crc kubenswrapper[4956]: I0314 09:19:15.972097 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z5sm\" (UniqueName: \"kubernetes.io/projected/0e00a078-7836-4348-b513-0c8af77d837d-kube-api-access-6z5sm\") pod \"kube-state-metrics-0\" (UID: \"0e00a078-7836-4348-b513-0c8af77d837d\") " pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.054733 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.054829 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-scripts\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.054858 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.054884 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b68eaedb-1389-45e7-b145-f8ad57fb4aca-log-httpd\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.054936 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b68eaedb-1389-45e7-b145-f8ad57fb4aca-run-httpd\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.054987 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m949\" (UniqueName: \"kubernetes.io/projected/b68eaedb-1389-45e7-b145-f8ad57fb4aca-kube-api-access-7m949\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " 
pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.055014 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.055049 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-config-data\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.055362 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b68eaedb-1389-45e7-b145-f8ad57fb4aca-log-httpd\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.055544 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b68eaedb-1389-45e7-b145-f8ad57fb4aca-run-httpd\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.055699 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.059359 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.059383 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.059968 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.062572 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-scripts\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.064066 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-config-data\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.077775 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-7m949\" (UniqueName: \"kubernetes.io/projected/b68eaedb-1389-45e7-b145-f8ad57fb4aca-kube-api-access-7m949\") pod \"ceilometer-0\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.107174 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.504283 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Mar 14 09:19:16 crc kubenswrapper[4956]: W0314 09:19:16.508495 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e00a078_7836_4348_b513_0c8af77d837d.slice/crio-4972f2edb6acbcd733af1c5d3f7d2c7eae608fbbb0b4ec0eb2b1f9ba62ec1e15 WatchSource:0}: Error finding container 4972f2edb6acbcd733af1c5d3f7d2c7eae608fbbb0b4ec0eb2b1f9ba62ec1e15: Status 404 returned error can't find the container with id 4972f2edb6acbcd733af1c5d3f7d2c7eae608fbbb0b4ec0eb2b1f9ba62ec1e15 Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.588316 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:19:16 crc kubenswrapper[4956]: W0314 09:19:16.591067 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb68eaedb_1389_45e7_b145_f8ad57fb4aca.slice/crio-32bf1d3af738cc20d14fcfa67288b3889cc3e695d5cf13129a2c42e3438b9b30 WatchSource:0}: Error finding container 32bf1d3af738cc20d14fcfa67288b3889cc3e695d5cf13129a2c42e3438b9b30: Status 404 returned error can't find the container with id 32bf1d3af738cc20d14fcfa67288b3889cc3e695d5cf13129a2c42e3438b9b30 Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.637891 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"b68eaedb-1389-45e7-b145-f8ad57fb4aca","Type":"ContainerStarted","Data":"32bf1d3af738cc20d14fcfa67288b3889cc3e695d5cf13129a2c42e3438b9b30"} Mar 14 09:19:16 crc kubenswrapper[4956]: I0314 09:19:16.639224 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"0e00a078-7836-4348-b513-0c8af77d837d","Type":"ContainerStarted","Data":"4972f2edb6acbcd733af1c5d3f7d2c7eae608fbbb0b4ec0eb2b1f9ba62ec1e15"} Mar 14 09:19:17 crc kubenswrapper[4956]: I0314 09:19:17.228051 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40d2917d-4096-4317-97e7-d84af851611b" path="/var/lib/kubelet/pods/40d2917d-4096-4317-97e7-d84af851611b/volumes" Mar 14 09:19:17 crc kubenswrapper[4956]: I0314 09:19:17.230071 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcc01311-198b-436e-b438-5229352baf03" path="/var/lib/kubelet/pods/dcc01311-198b-436e-b438-5229352baf03/volumes" Mar 14 09:19:17 crc kubenswrapper[4956]: I0314 09:19:17.647876 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b68eaedb-1389-45e7-b145-f8ad57fb4aca","Type":"ContainerStarted","Data":"96b111767c44fb930d0722254c48ef09c32fd36be28173dafc6e4d2660d3f6a6"} Mar 14 09:19:17 crc kubenswrapper[4956]: I0314 09:19:17.649034 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"0e00a078-7836-4348-b513-0c8af77d837d","Type":"ContainerStarted","Data":"9920a05b3e8dda1f900e113c3251c8b0e1be840cffa4ba2e1680eac87e5450f8"} Mar 14 09:19:17 crc kubenswrapper[4956]: I0314 09:19:17.650049 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:17 crc kubenswrapper[4956]: I0314 09:19:17.677223 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/kube-state-metrics-0" podStartSLOduration=2.338005274 
podStartE2EDuration="2.677206786s" podCreationTimestamp="2026-03-14 09:19:15 +0000 UTC" firstStartedPulling="2026-03-14 09:19:16.51052771 +0000 UTC m=+1362.023219978" lastFinishedPulling="2026-03-14 09:19:16.849729222 +0000 UTC m=+1362.362421490" observedRunningTime="2026-03-14 09:19:17.669170245 +0000 UTC m=+1363.181862513" watchObservedRunningTime="2026-03-14 09:19:17.677206786 +0000 UTC m=+1363.189899054" Mar 14 09:19:18 crc kubenswrapper[4956]: I0314 09:19:18.659273 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b68eaedb-1389-45e7-b145-f8ad57fb4aca","Type":"ContainerStarted","Data":"155cf0a7b6a3587aeb64da5910961603f8c25fd91bf9f46e063b1918128bd59e"} Mar 14 09:19:18 crc kubenswrapper[4956]: I0314 09:19:18.659577 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b68eaedb-1389-45e7-b145-f8ad57fb4aca","Type":"ContainerStarted","Data":"92d92aaa9fae56c406d0a41f173a0f421d0b742d3b1c144bfb7037445f52d8ef"} Mar 14 09:19:19 crc kubenswrapper[4956]: I0314 09:19:19.819608 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-m6s6v"] Mar 14 09:19:19 crc kubenswrapper[4956]: I0314 09:19:19.820967 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-m6s6v" Mar 14 09:19:19 crc kubenswrapper[4956]: I0314 09:19:19.835951 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-0215-account-create-update-5sn9p"] Mar 14 09:19:19 crc kubenswrapper[4956]: I0314 09:19:19.837023 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-0215-account-create-update-5sn9p" Mar 14 09:19:19 crc kubenswrapper[4956]: I0314 09:19:19.838765 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Mar 14 09:19:19 crc kubenswrapper[4956]: I0314 09:19:19.859512 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-m6s6v"] Mar 14 09:19:19 crc kubenswrapper[4956]: I0314 09:19:19.887072 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-0215-account-create-update-5sn9p"] Mar 14 09:19:19 crc kubenswrapper[4956]: I0314 09:19:19.920604 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2jwk\" (UniqueName: \"kubernetes.io/projected/ddaf68d2-3b2c-4293-a453-ecbeb31bea81-kube-api-access-t2jwk\") pod \"watcher-db-create-m6s6v\" (UID: \"ddaf68d2-3b2c-4293-a453-ecbeb31bea81\") " pod="watcher-kuttl-default/watcher-db-create-m6s6v" Mar 14 09:19:19 crc kubenswrapper[4956]: I0314 09:19:19.920829 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddaf68d2-3b2c-4293-a453-ecbeb31bea81-operator-scripts\") pod \"watcher-db-create-m6s6v\" (UID: \"ddaf68d2-3b2c-4293-a453-ecbeb31bea81\") " pod="watcher-kuttl-default/watcher-db-create-m6s6v" Mar 14 09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.022052 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2jwk\" (UniqueName: \"kubernetes.io/projected/ddaf68d2-3b2c-4293-a453-ecbeb31bea81-kube-api-access-t2jwk\") pod \"watcher-db-create-m6s6v\" (UID: \"ddaf68d2-3b2c-4293-a453-ecbeb31bea81\") " pod="watcher-kuttl-default/watcher-db-create-m6s6v" Mar 14 09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.022118 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpjbp\" (UniqueName: \"kubernetes.io/projected/57cae041-75e9-407f-8d18-3c0da11e7a73-kube-api-access-rpjbp\") pod \"watcher-0215-account-create-update-5sn9p\" (UID: \"57cae041-75e9-407f-8d18-3c0da11e7a73\") " pod="watcher-kuttl-default/watcher-0215-account-create-update-5sn9p" Mar 14 09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.022353 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57cae041-75e9-407f-8d18-3c0da11e7a73-operator-scripts\") pod \"watcher-0215-account-create-update-5sn9p\" (UID: \"57cae041-75e9-407f-8d18-3c0da11e7a73\") " pod="watcher-kuttl-default/watcher-0215-account-create-update-5sn9p" Mar 14 09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.022584 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddaf68d2-3b2c-4293-a453-ecbeb31bea81-operator-scripts\") pod \"watcher-db-create-m6s6v\" (UID: \"ddaf68d2-3b2c-4293-a453-ecbeb31bea81\") " pod="watcher-kuttl-default/watcher-db-create-m6s6v" Mar 14 09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.023905 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddaf68d2-3b2c-4293-a453-ecbeb31bea81-operator-scripts\") pod \"watcher-db-create-m6s6v\" (UID: \"ddaf68d2-3b2c-4293-a453-ecbeb31bea81\") " pod="watcher-kuttl-default/watcher-db-create-m6s6v" Mar 14 09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.041652 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2jwk\" (UniqueName: \"kubernetes.io/projected/ddaf68d2-3b2c-4293-a453-ecbeb31bea81-kube-api-access-t2jwk\") pod \"watcher-db-create-m6s6v\" (UID: \"ddaf68d2-3b2c-4293-a453-ecbeb31bea81\") " pod="watcher-kuttl-default/watcher-db-create-m6s6v" Mar 14 
09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.124280 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpjbp\" (UniqueName: \"kubernetes.io/projected/57cae041-75e9-407f-8d18-3c0da11e7a73-kube-api-access-rpjbp\") pod \"watcher-0215-account-create-update-5sn9p\" (UID: \"57cae041-75e9-407f-8d18-3c0da11e7a73\") " pod="watcher-kuttl-default/watcher-0215-account-create-update-5sn9p" Mar 14 09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.124360 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57cae041-75e9-407f-8d18-3c0da11e7a73-operator-scripts\") pod \"watcher-0215-account-create-update-5sn9p\" (UID: \"57cae041-75e9-407f-8d18-3c0da11e7a73\") " pod="watcher-kuttl-default/watcher-0215-account-create-update-5sn9p" Mar 14 09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.125303 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57cae041-75e9-407f-8d18-3c0da11e7a73-operator-scripts\") pod \"watcher-0215-account-create-update-5sn9p\" (UID: \"57cae041-75e9-407f-8d18-3c0da11e7a73\") " pod="watcher-kuttl-default/watcher-0215-account-create-update-5sn9p" Mar 14 09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.142907 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpjbp\" (UniqueName: \"kubernetes.io/projected/57cae041-75e9-407f-8d18-3c0da11e7a73-kube-api-access-rpjbp\") pod \"watcher-0215-account-create-update-5sn9p\" (UID: \"57cae041-75e9-407f-8d18-3c0da11e7a73\") " pod="watcher-kuttl-default/watcher-0215-account-create-update-5sn9p" Mar 14 09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.148973 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-m6s6v" Mar 14 09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.164641 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-0215-account-create-update-5sn9p" Mar 14 09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.648058 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-m6s6v"] Mar 14 09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.692659 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-m6s6v" event={"ID":"ddaf68d2-3b2c-4293-a453-ecbeb31bea81","Type":"ContainerStarted","Data":"8e16eaf3484073dfc4672275f261538f5c681395411941a015d484ef7f45100a"} Mar 14 09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.696798 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b68eaedb-1389-45e7-b145-f8ad57fb4aca","Type":"ContainerStarted","Data":"ee0f19568bbd227f40637bff374f3469b8cbf29bcd2642a239945e6454e64939"} Mar 14 09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.697060 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.717794 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-0215-account-create-update-5sn9p"] Mar 14 09:19:20 crc kubenswrapper[4956]: I0314 09:19:20.724219 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.409415695 podStartE2EDuration="5.724202661s" podCreationTimestamp="2026-03-14 09:19:15 +0000 UTC" firstStartedPulling="2026-03-14 09:19:16.593401517 +0000 UTC m=+1362.106093785" lastFinishedPulling="2026-03-14 09:19:19.908188483 +0000 UTC m=+1365.420880751" observedRunningTime="2026-03-14 09:19:20.715110844 +0000 UTC m=+1366.227803112" 
watchObservedRunningTime="2026-03-14 09:19:20.724202661 +0000 UTC m=+1366.236894929" Mar 14 09:19:21 crc kubenswrapper[4956]: I0314 09:19:21.706255 4956 generic.go:334] "Generic (PLEG): container finished" podID="ddaf68d2-3b2c-4293-a453-ecbeb31bea81" containerID="99d6e24ce25eefdbf16f95231746247c8f258c2075ab3b2488e756c82c682b8f" exitCode=0 Mar 14 09:19:21 crc kubenswrapper[4956]: I0314 09:19:21.706706 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-m6s6v" event={"ID":"ddaf68d2-3b2c-4293-a453-ecbeb31bea81","Type":"ContainerDied","Data":"99d6e24ce25eefdbf16f95231746247c8f258c2075ab3b2488e756c82c682b8f"} Mar 14 09:19:21 crc kubenswrapper[4956]: I0314 09:19:21.708866 4956 generic.go:334] "Generic (PLEG): container finished" podID="57cae041-75e9-407f-8d18-3c0da11e7a73" containerID="36ff6362dff416d20c94f92e2f02b441454d355f249f615a267c686e7d6fd339" exitCode=0 Mar 14 09:19:21 crc kubenswrapper[4956]: I0314 09:19:21.709957 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-0215-account-create-update-5sn9p" event={"ID":"57cae041-75e9-407f-8d18-3c0da11e7a73","Type":"ContainerDied","Data":"36ff6362dff416d20c94f92e2f02b441454d355f249f615a267c686e7d6fd339"} Mar 14 09:19:21 crc kubenswrapper[4956]: I0314 09:19:21.709991 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-0215-account-create-update-5sn9p" event={"ID":"57cae041-75e9-407f-8d18-3c0da11e7a73","Type":"ContainerStarted","Data":"1fd2ebfe6e5cc48587f8303509dc560ab4d015f6b2b3361c69be3f915a17adbc"} Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.122837 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-0215-account-create-update-5sn9p" Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.244574 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-m6s6v" Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.288472 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57cae041-75e9-407f-8d18-3c0da11e7a73-operator-scripts\") pod \"57cae041-75e9-407f-8d18-3c0da11e7a73\" (UID: \"57cae041-75e9-407f-8d18-3c0da11e7a73\") " Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.288567 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpjbp\" (UniqueName: \"kubernetes.io/projected/57cae041-75e9-407f-8d18-3c0da11e7a73-kube-api-access-rpjbp\") pod \"57cae041-75e9-407f-8d18-3c0da11e7a73\" (UID: \"57cae041-75e9-407f-8d18-3c0da11e7a73\") " Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.288895 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57cae041-75e9-407f-8d18-3c0da11e7a73-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "57cae041-75e9-407f-8d18-3c0da11e7a73" (UID: "57cae041-75e9-407f-8d18-3c0da11e7a73"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.289412 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57cae041-75e9-407f-8d18-3c0da11e7a73-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.293685 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57cae041-75e9-407f-8d18-3c0da11e7a73-kube-api-access-rpjbp" (OuterVolumeSpecName: "kube-api-access-rpjbp") pod "57cae041-75e9-407f-8d18-3c0da11e7a73" (UID: "57cae041-75e9-407f-8d18-3c0da11e7a73"). InnerVolumeSpecName "kube-api-access-rpjbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.390476 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddaf68d2-3b2c-4293-a453-ecbeb31bea81-operator-scripts\") pod \"ddaf68d2-3b2c-4293-a453-ecbeb31bea81\" (UID: \"ddaf68d2-3b2c-4293-a453-ecbeb31bea81\") " Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.390665 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2jwk\" (UniqueName: \"kubernetes.io/projected/ddaf68d2-3b2c-4293-a453-ecbeb31bea81-kube-api-access-t2jwk\") pod \"ddaf68d2-3b2c-4293-a453-ecbeb31bea81\" (UID: \"ddaf68d2-3b2c-4293-a453-ecbeb31bea81\") " Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.391078 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpjbp\" (UniqueName: \"kubernetes.io/projected/57cae041-75e9-407f-8d18-3c0da11e7a73-kube-api-access-rpjbp\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.391106 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddaf68d2-3b2c-4293-a453-ecbeb31bea81-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ddaf68d2-3b2c-4293-a453-ecbeb31bea81" (UID: "ddaf68d2-3b2c-4293-a453-ecbeb31bea81"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.393808 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddaf68d2-3b2c-4293-a453-ecbeb31bea81-kube-api-access-t2jwk" (OuterVolumeSpecName: "kube-api-access-t2jwk") pod "ddaf68d2-3b2c-4293-a453-ecbeb31bea81" (UID: "ddaf68d2-3b2c-4293-a453-ecbeb31bea81"). InnerVolumeSpecName "kube-api-access-t2jwk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.493004 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddaf68d2-3b2c-4293-a453-ecbeb31bea81-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.493394 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2jwk\" (UniqueName: \"kubernetes.io/projected/ddaf68d2-3b2c-4293-a453-ecbeb31bea81-kube-api-access-t2jwk\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.730219 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-m6s6v" event={"ID":"ddaf68d2-3b2c-4293-a453-ecbeb31bea81","Type":"ContainerDied","Data":"8e16eaf3484073dfc4672275f261538f5c681395411941a015d484ef7f45100a"} Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.731110 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e16eaf3484073dfc4672275f261538f5c681395411941a015d484ef7f45100a" Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.730241 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-m6s6v" Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.733618 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-0215-account-create-update-5sn9p" event={"ID":"57cae041-75e9-407f-8d18-3c0da11e7a73","Type":"ContainerDied","Data":"1fd2ebfe6e5cc48587f8303509dc560ab4d015f6b2b3361c69be3f915a17adbc"} Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.733808 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fd2ebfe6e5cc48587f8303509dc560ab4d015f6b2b3361c69be3f915a17adbc" Mar 14 09:19:23 crc kubenswrapper[4956]: I0314 09:19:23.733706 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-0215-account-create-update-5sn9p" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.268200 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv"] Mar 14 09:19:25 crc kubenswrapper[4956]: E0314 09:19:25.268635 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddaf68d2-3b2c-4293-a453-ecbeb31bea81" containerName="mariadb-database-create" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.268649 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddaf68d2-3b2c-4293-a453-ecbeb31bea81" containerName="mariadb-database-create" Mar 14 09:19:25 crc kubenswrapper[4956]: E0314 09:19:25.268664 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57cae041-75e9-407f-8d18-3c0da11e7a73" containerName="mariadb-account-create-update" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.268673 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="57cae041-75e9-407f-8d18-3c0da11e7a73" containerName="mariadb-account-create-update" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.268863 4956 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ddaf68d2-3b2c-4293-a453-ecbeb31bea81" containerName="mariadb-database-create" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.268888 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="57cae041-75e9-407f-8d18-3c0da11e7a73" containerName="mariadb-account-create-update" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.269527 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.272010 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.272335 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-z4sdl" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.283876 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv"] Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.422468 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6nw4\" (UniqueName: \"kubernetes.io/projected/a547039f-2d8f-45d4-8b32-05fa39f51b9c-kube-api-access-g6nw4\") pod \"watcher-kuttl-db-sync-hfdcv\" (UID: \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.422528 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-db-sync-config-data\") pod \"watcher-kuttl-db-sync-hfdcv\" (UID: \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.422678 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-config-data\") pod \"watcher-kuttl-db-sync-hfdcv\" (UID: \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.422719 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-hfdcv\" (UID: \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.424238 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.424298 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.523818 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-config-data\") pod \"watcher-kuttl-db-sync-hfdcv\" (UID: \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.524156 4956 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-hfdcv\" (UID: \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.524205 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6nw4\" (UniqueName: \"kubernetes.io/projected/a547039f-2d8f-45d4-8b32-05fa39f51b9c-kube-api-access-g6nw4\") pod \"watcher-kuttl-db-sync-hfdcv\" (UID: \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.524223 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-db-sync-config-data\") pod \"watcher-kuttl-db-sync-hfdcv\" (UID: \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.528579 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-db-sync-config-data\") pod \"watcher-kuttl-db-sync-hfdcv\" (UID: \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.529032 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-config-data\") pod \"watcher-kuttl-db-sync-hfdcv\" (UID: \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.531825 4956 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-hfdcv\" (UID: \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.540524 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6nw4\" (UniqueName: \"kubernetes.io/projected/a547039f-2d8f-45d4-8b32-05fa39f51b9c-kube-api-access-g6nw4\") pod \"watcher-kuttl-db-sync-hfdcv\" (UID: \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" Mar 14 09:19:25 crc kubenswrapper[4956]: I0314 09:19:25.591820 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" Mar 14 09:19:26 crc kubenswrapper[4956]: I0314 09:19:26.043382 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv"] Mar 14 09:19:26 crc kubenswrapper[4956]: W0314 09:19:26.045054 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda547039f_2d8f_45d4_8b32_05fa39f51b9c.slice/crio-c27fe433f6d2a6b174eae15135c4fda6ae06f43c6e8df8ddfcfff60ba3b74c96 WatchSource:0}: Error finding container c27fe433f6d2a6b174eae15135c4fda6ae06f43c6e8df8ddfcfff60ba3b74c96: Status 404 returned error can't find the container with id c27fe433f6d2a6b174eae15135c4fda6ae06f43c6e8df8ddfcfff60ba3b74c96 Mar 14 09:19:26 crc kubenswrapper[4956]: I0314 09:19:26.067931 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/kube-state-metrics-0" Mar 14 09:19:26 crc kubenswrapper[4956]: I0314 09:19:26.761299 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" 
event={"ID":"a547039f-2d8f-45d4-8b32-05fa39f51b9c","Type":"ContainerStarted","Data":"c27fe433f6d2a6b174eae15135c4fda6ae06f43c6e8df8ddfcfff60ba3b74c96"} Mar 14 09:19:40 crc kubenswrapper[4956]: E0314 09:19:40.383078 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.153:5001/podified-master-centos10/openstack-watcher-api:watcher_latest" Mar 14 09:19:40 crc kubenswrapper[4956]: E0314 09:19:40.383565 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.153:5001/podified-master-centos10/openstack-watcher-api:watcher_latest" Mar 14 09:19:40 crc kubenswrapper[4956]: E0314 09:19:40.383683 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:watcher-kuttl-db-sync,Image:38.102.83.153:5001/podified-master-centos10/openstack-watcher-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/watcher/watcher.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:watcher-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g6nw4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
watcher-kuttl-db-sync-hfdcv_watcher-kuttl-default(a547039f-2d8f-45d4-8b32-05fa39f51b9c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 09:19:40 crc kubenswrapper[4956]: E0314 09:19:40.384871 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-kuttl-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" podUID="a547039f-2d8f-45d4-8b32-05fa39f51b9c" Mar 14 09:19:40 crc kubenswrapper[4956]: E0314 09:19:40.900459 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-kuttl-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.153:5001/podified-master-centos10/openstack-watcher-api:watcher_latest\\\"\"" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" podUID="a547039f-2d8f-45d4-8b32-05fa39f51b9c" Mar 14 09:19:46 crc kubenswrapper[4956]: I0314 09:19:46.115707 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:19:51 crc kubenswrapper[4956]: I0314 09:19:51.990352 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" event={"ID":"a547039f-2d8f-45d4-8b32-05fa39f51b9c","Type":"ContainerStarted","Data":"1d98748227b182d14e254f302151164d84ce0ff4e070d3a589d2e88cec797a80"} Mar 14 09:19:52 crc kubenswrapper[4956]: I0314 09:19:52.011699 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" podStartSLOduration=1.7763645989999999 podStartE2EDuration="27.011681713s" podCreationTimestamp="2026-03-14 09:19:25 +0000 UTC" firstStartedPulling="2026-03-14 09:19:26.047056695 +0000 UTC m=+1371.559748963" lastFinishedPulling="2026-03-14 09:19:51.282373809 +0000 UTC m=+1396.795066077" 
observedRunningTime="2026-03-14 09:19:52.006319339 +0000 UTC m=+1397.519011597" watchObservedRunningTime="2026-03-14 09:19:52.011681713 +0000 UTC m=+1397.524373981" Mar 14 09:19:55 crc kubenswrapper[4956]: I0314 09:19:55.016506 4956 generic.go:334] "Generic (PLEG): container finished" podID="a547039f-2d8f-45d4-8b32-05fa39f51b9c" containerID="1d98748227b182d14e254f302151164d84ce0ff4e070d3a589d2e88cec797a80" exitCode=0 Mar 14 09:19:55 crc kubenswrapper[4956]: I0314 09:19:55.016627 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" event={"ID":"a547039f-2d8f-45d4-8b32-05fa39f51b9c","Type":"ContainerDied","Data":"1d98748227b182d14e254f302151164d84ce0ff4e070d3a589d2e88cec797a80"} Mar 14 09:19:55 crc kubenswrapper[4956]: I0314 09:19:55.423813 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:19:55 crc kubenswrapper[4956]: I0314 09:19:55.423885 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:19:55 crc kubenswrapper[4956]: I0314 09:19:55.423943 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 09:19:55 crc kubenswrapper[4956]: I0314 09:19:55.424605 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d51ed46fef8fdbd34e0f9aab241d72045408a6fd0c1f2415549d37d1cae23089"} 
pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:19:55 crc kubenswrapper[4956]: I0314 09:19:55.424680 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" containerID="cri-o://d51ed46fef8fdbd34e0f9aab241d72045408a6fd0c1f2415549d37d1cae23089" gracePeriod=600 Mar 14 09:19:56 crc kubenswrapper[4956]: I0314 09:19:56.031470 4956 generic.go:334] "Generic (PLEG): container finished" podID="9ba20367-e506-422e-a846-eb1525cb3b94" containerID="d51ed46fef8fdbd34e0f9aab241d72045408a6fd0c1f2415549d37d1cae23089" exitCode=0 Mar 14 09:19:56 crc kubenswrapper[4956]: I0314 09:19:56.031778 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerDied","Data":"d51ed46fef8fdbd34e0f9aab241d72045408a6fd0c1f2415549d37d1cae23089"} Mar 14 09:19:56 crc kubenswrapper[4956]: I0314 09:19:56.031825 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerStarted","Data":"4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb"} Mar 14 09:19:56 crc kubenswrapper[4956]: I0314 09:19:56.031844 4956 scope.go:117] "RemoveContainer" containerID="627cfd99357e3b66570faed20a1ce7ae2cc7c510054d839a6cb159952f60d6be" Mar 14 09:19:56 crc kubenswrapper[4956]: I0314 09:19:56.335213 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" Mar 14 09:19:56 crc kubenswrapper[4956]: I0314 09:19:56.445297 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6nw4\" (UniqueName: \"kubernetes.io/projected/a547039f-2d8f-45d4-8b32-05fa39f51b9c-kube-api-access-g6nw4\") pod \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\" (UID: \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\") " Mar 14 09:19:56 crc kubenswrapper[4956]: I0314 09:19:56.445367 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-config-data\") pod \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\" (UID: \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\") " Mar 14 09:19:56 crc kubenswrapper[4956]: I0314 09:19:56.445464 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-db-sync-config-data\") pod \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\" (UID: \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\") " Mar 14 09:19:56 crc kubenswrapper[4956]: I0314 09:19:56.445510 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-combined-ca-bundle\") pod \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\" (UID: \"a547039f-2d8f-45d4-8b32-05fa39f51b9c\") " Mar 14 09:19:56 crc kubenswrapper[4956]: I0314 09:19:56.451145 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a547039f-2d8f-45d4-8b32-05fa39f51b9c" (UID: "a547039f-2d8f-45d4-8b32-05fa39f51b9c"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:19:56 crc kubenswrapper[4956]: I0314 09:19:56.451208 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a547039f-2d8f-45d4-8b32-05fa39f51b9c-kube-api-access-g6nw4" (OuterVolumeSpecName: "kube-api-access-g6nw4") pod "a547039f-2d8f-45d4-8b32-05fa39f51b9c" (UID: "a547039f-2d8f-45d4-8b32-05fa39f51b9c"). InnerVolumeSpecName "kube-api-access-g6nw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:19:56 crc kubenswrapper[4956]: I0314 09:19:56.470609 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a547039f-2d8f-45d4-8b32-05fa39f51b9c" (UID: "a547039f-2d8f-45d4-8b32-05fa39f51b9c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:19:56 crc kubenswrapper[4956]: I0314 09:19:56.485336 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-config-data" (OuterVolumeSpecName: "config-data") pod "a547039f-2d8f-45d4-8b32-05fa39f51b9c" (UID: "a547039f-2d8f-45d4-8b32-05fa39f51b9c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:19:56 crc kubenswrapper[4956]: I0314 09:19:56.547009 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6nw4\" (UniqueName: \"kubernetes.io/projected/a547039f-2d8f-45d4-8b32-05fa39f51b9c-kube-api-access-g6nw4\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:56 crc kubenswrapper[4956]: I0314 09:19:56.547054 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:56 crc kubenswrapper[4956]: I0314 09:19:56.547068 4956 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:56 crc kubenswrapper[4956]: I0314 09:19:56.547080 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a547039f-2d8f-45d4-8b32-05fa39f51b9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.043749 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" event={"ID":"a547039f-2d8f-45d4-8b32-05fa39f51b9c","Type":"ContainerDied","Data":"c27fe433f6d2a6b174eae15135c4fda6ae06f43c6e8df8ddfcfff60ba3b74c96"} Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.043786 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.043801 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c27fe433f6d2a6b174eae15135c4fda6ae06f43c6e8df8ddfcfff60ba3b74c96" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.425457 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:19:57 crc kubenswrapper[4956]: E0314 09:19:57.426181 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a547039f-2d8f-45d4-8b32-05fa39f51b9c" containerName="watcher-kuttl-db-sync" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.426206 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a547039f-2d8f-45d4-8b32-05fa39f51b9c" containerName="watcher-kuttl-db-sync" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.426422 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="a547039f-2d8f-45d4-8b32-05fa39f51b9c" containerName="watcher-kuttl-db-sync" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.427120 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.430441 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.430794 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-z4sdl" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.435684 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.437402 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.443504 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.447154 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.448222 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.449609 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.458642 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.486563 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.513657 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.561792 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.561851 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d63e9247-2a6e-4255-ac39-a1aa02803da8-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"d63e9247-2a6e-4255-ac39-a1aa02803da8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.561886 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmxs2\" (UniqueName: \"kubernetes.io/projected/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-kube-api-access-kmxs2\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.561918 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.561953 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.561993 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f47c104-0d0a-4883-b925-0472823393b7-logs\") pod \"watcher-kuttl-api-0\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.562026 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.562064 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.562097 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65wl8\" (UniqueName: \"kubernetes.io/projected/6f47c104-0d0a-4883-b925-0472823393b7-kube-api-access-65wl8\") pod \"watcher-kuttl-api-0\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.562118 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63e9247-2a6e-4255-ac39-a1aa02803da8-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"d63e9247-2a6e-4255-ac39-a1aa02803da8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.562159 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63e9247-2a6e-4255-ac39-a1aa02803da8-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"d63e9247-2a6e-4255-ac39-a1aa02803da8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.562184 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.562206 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.562237 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zft4\" (UniqueName: \"kubernetes.io/projected/d63e9247-2a6e-4255-ac39-a1aa02803da8-kube-api-access-5zft4\") pod \"watcher-kuttl-applier-0\" (UID: \"d63e9247-2a6e-4255-ac39-a1aa02803da8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.663972 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63e9247-2a6e-4255-ac39-a1aa02803da8-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"d63e9247-2a6e-4255-ac39-a1aa02803da8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.664032 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 
09:19:57.664061 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.664088 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zft4\" (UniqueName: \"kubernetes.io/projected/d63e9247-2a6e-4255-ac39-a1aa02803da8-kube-api-access-5zft4\") pod \"watcher-kuttl-applier-0\" (UID: \"d63e9247-2a6e-4255-ac39-a1aa02803da8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.664134 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.664157 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d63e9247-2a6e-4255-ac39-a1aa02803da8-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"d63e9247-2a6e-4255-ac39-a1aa02803da8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.664172 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmxs2\" (UniqueName: \"kubernetes.io/projected/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-kube-api-access-kmxs2\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 
09:19:57.664192 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.664216 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.664245 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f47c104-0d0a-4883-b925-0472823393b7-logs\") pod \"watcher-kuttl-api-0\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.664269 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.664295 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.664320 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-65wl8\" (UniqueName: \"kubernetes.io/projected/6f47c104-0d0a-4883-b925-0472823393b7-kube-api-access-65wl8\") pod \"watcher-kuttl-api-0\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.664336 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63e9247-2a6e-4255-ac39-a1aa02803da8-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"d63e9247-2a6e-4255-ac39-a1aa02803da8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.664890 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f47c104-0d0a-4883-b925-0472823393b7-logs\") pod \"watcher-kuttl-api-0\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.664894 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d63e9247-2a6e-4255-ac39-a1aa02803da8-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"d63e9247-2a6e-4255-ac39-a1aa02803da8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.665220 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.668988 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.669130 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.670452 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63e9247-2a6e-4255-ac39-a1aa02803da8-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"d63e9247-2a6e-4255-ac39-a1aa02803da8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.670853 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63e9247-2a6e-4255-ac39-a1aa02803da8-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"d63e9247-2a6e-4255-ac39-a1aa02803da8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.671090 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.671401 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-combined-ca-bundle\") pod 
\"watcher-kuttl-decision-engine-0\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.671920 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.672200 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.685783 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zft4\" (UniqueName: \"kubernetes.io/projected/d63e9247-2a6e-4255-ac39-a1aa02803da8-kube-api-access-5zft4\") pod \"watcher-kuttl-applier-0\" (UID: \"d63e9247-2a6e-4255-ac39-a1aa02803da8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.686453 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmxs2\" (UniqueName: \"kubernetes.io/projected/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-kube-api-access-kmxs2\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.686769 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65wl8\" (UniqueName: 
\"kubernetes.io/projected/6f47c104-0d0a-4883-b925-0472823393b7-kube-api-access-65wl8\") pod \"watcher-kuttl-api-0\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.750867 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.762741 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:57 crc kubenswrapper[4956]: I0314 09:19:57.774611 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:19:58 crc kubenswrapper[4956]: I0314 09:19:58.233681 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:19:58 crc kubenswrapper[4956]: W0314 09:19:58.237098 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f47c104_0d0a_4883_b925_0472823393b7.slice/crio-e63fcdf110089e373f93a5c295c7873bf894bc0259a7d6f56c6ba66346649d89 WatchSource:0}: Error finding container e63fcdf110089e373f93a5c295c7873bf894bc0259a7d6f56c6ba66346649d89: Status 404 returned error can't find the container with id e63fcdf110089e373f93a5c295c7873bf894bc0259a7d6f56c6ba66346649d89 Mar 14 09:19:58 crc kubenswrapper[4956]: I0314 09:19:58.278780 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:19:58 crc kubenswrapper[4956]: W0314 09:19:58.280124 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14c1f6f8_a23d_43a4_8b33_9c6aa8691ead.slice/crio-46fbf52f8dcc9256e007fb248878ef41a20140a1796f435286a5ac06f24064d8 
WatchSource:0}: Error finding container 46fbf52f8dcc9256e007fb248878ef41a20140a1796f435286a5ac06f24064d8: Status 404 returned error can't find the container with id 46fbf52f8dcc9256e007fb248878ef41a20140a1796f435286a5ac06f24064d8 Mar 14 09:19:58 crc kubenswrapper[4956]: I0314 09:19:58.362307 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:19:58 crc kubenswrapper[4956]: W0314 09:19:58.366582 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd63e9247_2a6e_4255_ac39_a1aa02803da8.slice/crio-94d68e89d45e187cbd66c47867521c1f6819ce444d75692811b3fdaef3b6f581 WatchSource:0}: Error finding container 94d68e89d45e187cbd66c47867521c1f6819ce444d75692811b3fdaef3b6f581: Status 404 returned error can't find the container with id 94d68e89d45e187cbd66c47867521c1f6819ce444d75692811b3fdaef3b6f581 Mar 14 09:19:59 crc kubenswrapper[4956]: I0314 09:19:59.072951 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead","Type":"ContainerStarted","Data":"46fbf52f8dcc9256e007fb248878ef41a20140a1796f435286a5ac06f24064d8"} Mar 14 09:19:59 crc kubenswrapper[4956]: I0314 09:19:59.075468 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6f47c104-0d0a-4883-b925-0472823393b7","Type":"ContainerStarted","Data":"f96200014d53288f19bd2fa5a858db7dbef04d0586cbcf3d0c158a1d586c2acc"} Mar 14 09:19:59 crc kubenswrapper[4956]: I0314 09:19:59.075827 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6f47c104-0d0a-4883-b925-0472823393b7","Type":"ContainerStarted","Data":"1be7ddb3925357aa7ba060261946518bd501b1ecea30185210568278e40fce2b"} Mar 14 09:19:59 crc kubenswrapper[4956]: I0314 09:19:59.075886 4956 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:19:59 crc kubenswrapper[4956]: I0314 09:19:59.075903 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6f47c104-0d0a-4883-b925-0472823393b7","Type":"ContainerStarted","Data":"e63fcdf110089e373f93a5c295c7873bf894bc0259a7d6f56c6ba66346649d89"} Mar 14 09:19:59 crc kubenswrapper[4956]: I0314 09:19:59.082452 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"d63e9247-2a6e-4255-ac39-a1aa02803da8","Type":"ContainerStarted","Data":"94d68e89d45e187cbd66c47867521c1f6819ce444d75692811b3fdaef3b6f581"} Mar 14 09:19:59 crc kubenswrapper[4956]: I0314 09:19:59.122111 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.122083322 podStartE2EDuration="2.122083322s" podCreationTimestamp="2026-03-14 09:19:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:19:59.101473987 +0000 UTC m=+1404.614166255" watchObservedRunningTime="2026-03-14 09:19:59.122083322 +0000 UTC m=+1404.634775590" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.091290 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"d63e9247-2a6e-4255-ac39-a1aa02803da8","Type":"ContainerStarted","Data":"a4ffbf458bbeaf137fefd07065930db59f04fdcf31235101f3a58eeeb9254328"} Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.094631 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead","Type":"ContainerStarted","Data":"4f685b8d4a901b33cf3e4f30a9033e489e71693ba0f234d60ca7b4694a11f504"} Mar 14 09:20:00 crc 
kubenswrapper[4956]: I0314 09:20:00.125661 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.446996928 podStartE2EDuration="3.125641938s" podCreationTimestamp="2026-03-14 09:19:57 +0000 UTC" firstStartedPulling="2026-03-14 09:19:58.369190409 +0000 UTC m=+1403.881882667" lastFinishedPulling="2026-03-14 09:19:59.047835409 +0000 UTC m=+1404.560527677" observedRunningTime="2026-03-14 09:20:00.120186462 +0000 UTC m=+1405.632878770" watchObservedRunningTime="2026-03-14 09:20:00.125641938 +0000 UTC m=+1405.638334206" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.147432 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558000-7z52v"] Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.148711 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558000-7z52v" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.150787 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.150884 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.152350 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.159239 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558000-7z52v"] Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.166974 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.40376978 podStartE2EDuration="3.166948499s" podCreationTimestamp="2026-03-14 09:19:57 +0000 UTC" 
firstStartedPulling="2026-03-14 09:19:58.282320592 +0000 UTC m=+1403.795012860" lastFinishedPulling="2026-03-14 09:19:59.045499291 +0000 UTC m=+1404.558191579" observedRunningTime="2026-03-14 09:20:00.14413806 +0000 UTC m=+1405.656830338" watchObservedRunningTime="2026-03-14 09:20:00.166948499 +0000 UTC m=+1405.679640767" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.225126 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqtdr\" (UniqueName: \"kubernetes.io/projected/4e9098de-4e80-4b2f-860d-065fac0cc574-kube-api-access-bqtdr\") pod \"auto-csr-approver-29558000-7z52v\" (UID: \"4e9098de-4e80-4b2f-860d-065fac0cc574\") " pod="openshift-infra/auto-csr-approver-29558000-7z52v" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.327160 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqtdr\" (UniqueName: \"kubernetes.io/projected/4e9098de-4e80-4b2f-860d-065fac0cc574-kube-api-access-bqtdr\") pod \"auto-csr-approver-29558000-7z52v\" (UID: \"4e9098de-4e80-4b2f-860d-065fac0cc574\") " pod="openshift-infra/auto-csr-approver-29558000-7z52v" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.353734 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqtdr\" (UniqueName: \"kubernetes.io/projected/4e9098de-4e80-4b2f-860d-065fac0cc574-kube-api-access-bqtdr\") pod \"auto-csr-approver-29558000-7z52v\" (UID: \"4e9098de-4e80-4b2f-860d-065fac0cc574\") " pod="openshift-infra/auto-csr-approver-29558000-7z52v" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.478449 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558000-7z52v" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.628053 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k7hzj"] Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.632832 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k7hzj" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.642281 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k7hzj"] Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.736902 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58w96\" (UniqueName: \"kubernetes.io/projected/f7d7957f-94f0-40d4-9e51-845d6014b789-kube-api-access-58w96\") pod \"redhat-operators-k7hzj\" (UID: \"f7d7957f-94f0-40d4-9e51-845d6014b789\") " pod="openshift-marketplace/redhat-operators-k7hzj" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.737105 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d7957f-94f0-40d4-9e51-845d6014b789-utilities\") pod \"redhat-operators-k7hzj\" (UID: \"f7d7957f-94f0-40d4-9e51-845d6014b789\") " pod="openshift-marketplace/redhat-operators-k7hzj" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.737232 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d7957f-94f0-40d4-9e51-845d6014b789-catalog-content\") pod \"redhat-operators-k7hzj\" (UID: \"f7d7957f-94f0-40d4-9e51-845d6014b789\") " pod="openshift-marketplace/redhat-operators-k7hzj" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.839350 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-58w96\" (UniqueName: \"kubernetes.io/projected/f7d7957f-94f0-40d4-9e51-845d6014b789-kube-api-access-58w96\") pod \"redhat-operators-k7hzj\" (UID: \"f7d7957f-94f0-40d4-9e51-845d6014b789\") " pod="openshift-marketplace/redhat-operators-k7hzj" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.839460 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d7957f-94f0-40d4-9e51-845d6014b789-utilities\") pod \"redhat-operators-k7hzj\" (UID: \"f7d7957f-94f0-40d4-9e51-845d6014b789\") " pod="openshift-marketplace/redhat-operators-k7hzj" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.839502 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d7957f-94f0-40d4-9e51-845d6014b789-catalog-content\") pod \"redhat-operators-k7hzj\" (UID: \"f7d7957f-94f0-40d4-9e51-845d6014b789\") " pod="openshift-marketplace/redhat-operators-k7hzj" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.840032 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d7957f-94f0-40d4-9e51-845d6014b789-utilities\") pod \"redhat-operators-k7hzj\" (UID: \"f7d7957f-94f0-40d4-9e51-845d6014b789\") " pod="openshift-marketplace/redhat-operators-k7hzj" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.840073 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d7957f-94f0-40d4-9e51-845d6014b789-catalog-content\") pod \"redhat-operators-k7hzj\" (UID: \"f7d7957f-94f0-40d4-9e51-845d6014b789\") " pod="openshift-marketplace/redhat-operators-k7hzj" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.858302 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58w96\" (UniqueName: 
\"kubernetes.io/projected/f7d7957f-94f0-40d4-9e51-845d6014b789-kube-api-access-58w96\") pod \"redhat-operators-k7hzj\" (UID: \"f7d7957f-94f0-40d4-9e51-845d6014b789\") " pod="openshift-marketplace/redhat-operators-k7hzj" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.961468 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k7hzj" Mar 14 09:20:00 crc kubenswrapper[4956]: I0314 09:20:00.968559 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558000-7z52v"] Mar 14 09:20:00 crc kubenswrapper[4956]: W0314 09:20:00.978167 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e9098de_4e80_4b2f_860d_065fac0cc574.slice/crio-cddd592be79d0bc99f262ef1110c80020e5b6fc6c310030d7f1dc8d7a0c6cff8 WatchSource:0}: Error finding container cddd592be79d0bc99f262ef1110c80020e5b6fc6c310030d7f1dc8d7a0c6cff8: Status 404 returned error can't find the container with id cddd592be79d0bc99f262ef1110c80020e5b6fc6c310030d7f1dc8d7a0c6cff8 Mar 14 09:20:01 crc kubenswrapper[4956]: I0314 09:20:01.118430 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558000-7z52v" event={"ID":"4e9098de-4e80-4b2f-860d-065fac0cc574","Type":"ContainerStarted","Data":"cddd592be79d0bc99f262ef1110c80020e5b6fc6c310030d7f1dc8d7a0c6cff8"} Mar 14 09:20:01 crc kubenswrapper[4956]: I0314 09:20:01.413756 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k7hzj"] Mar 14 09:20:01 crc kubenswrapper[4956]: I0314 09:20:01.414815 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:20:01 crc kubenswrapper[4956]: W0314 09:20:01.418746 4956 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7d7957f_94f0_40d4_9e51_845d6014b789.slice/crio-ab783c9318928a1ee23cd772fe5a31ce3941efe12713f1b332cd3ce8d0847279 WatchSource:0}: Error finding container ab783c9318928a1ee23cd772fe5a31ce3941efe12713f1b332cd3ce8d0847279: Status 404 returned error can't find the container with id ab783c9318928a1ee23cd772fe5a31ce3941efe12713f1b332cd3ce8d0847279 Mar 14 09:20:02 crc kubenswrapper[4956]: I0314 09:20:02.163817 4956 generic.go:334] "Generic (PLEG): container finished" podID="f7d7957f-94f0-40d4-9e51-845d6014b789" containerID="14697ac81ed430507d447b0048f383a8f1c974100dfa7e941461cc8637d2ef58" exitCode=0 Mar 14 09:20:02 crc kubenswrapper[4956]: I0314 09:20:02.163862 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7hzj" event={"ID":"f7d7957f-94f0-40d4-9e51-845d6014b789","Type":"ContainerDied","Data":"14697ac81ed430507d447b0048f383a8f1c974100dfa7e941461cc8637d2ef58"} Mar 14 09:20:02 crc kubenswrapper[4956]: I0314 09:20:02.163890 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7hzj" event={"ID":"f7d7957f-94f0-40d4-9e51-845d6014b789","Type":"ContainerStarted","Data":"ab783c9318928a1ee23cd772fe5a31ce3941efe12713f1b332cd3ce8d0847279"} Mar 14 09:20:02 crc kubenswrapper[4956]: I0314 09:20:02.763153 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:20:02 crc kubenswrapper[4956]: I0314 09:20:02.774959 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:20:03 crc kubenswrapper[4956]: I0314 09:20:03.181195 4956 generic.go:334] "Generic (PLEG): container finished" podID="4e9098de-4e80-4b2f-860d-065fac0cc574" containerID="19d19b6f9bdb20d58e7b55d660211ff8fb28b46014f7ea8bf2cac697e1d7cecf" exitCode=0 Mar 14 09:20:03 crc kubenswrapper[4956]: I0314 09:20:03.181257 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558000-7z52v" event={"ID":"4e9098de-4e80-4b2f-860d-065fac0cc574","Type":"ContainerDied","Data":"19d19b6f9bdb20d58e7b55d660211ff8fb28b46014f7ea8bf2cac697e1d7cecf"} Mar 14 09:20:04 crc kubenswrapper[4956]: I0314 09:20:04.190382 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7hzj" event={"ID":"f7d7957f-94f0-40d4-9e51-845d6014b789","Type":"ContainerStarted","Data":"4de93f3ad6c7fcd147d5e40a0827245b631faad7c691053c9e7b3d1b289a4758"} Mar 14 09:20:04 crc kubenswrapper[4956]: I0314 09:20:04.658780 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558000-7z52v" Mar 14 09:20:04 crc kubenswrapper[4956]: I0314 09:20:04.721601 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqtdr\" (UniqueName: \"kubernetes.io/projected/4e9098de-4e80-4b2f-860d-065fac0cc574-kube-api-access-bqtdr\") pod \"4e9098de-4e80-4b2f-860d-065fac0cc574\" (UID: \"4e9098de-4e80-4b2f-860d-065fac0cc574\") " Mar 14 09:20:04 crc kubenswrapper[4956]: I0314 09:20:04.727510 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e9098de-4e80-4b2f-860d-065fac0cc574-kube-api-access-bqtdr" (OuterVolumeSpecName: "kube-api-access-bqtdr") pod "4e9098de-4e80-4b2f-860d-065fac0cc574" (UID: "4e9098de-4e80-4b2f-860d-065fac0cc574"). InnerVolumeSpecName "kube-api-access-bqtdr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:20:04 crc kubenswrapper[4956]: I0314 09:20:04.823854 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqtdr\" (UniqueName: \"kubernetes.io/projected/4e9098de-4e80-4b2f-860d-065fac0cc574-kube-api-access-bqtdr\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:05 crc kubenswrapper[4956]: I0314 09:20:05.201716 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558000-7z52v" event={"ID":"4e9098de-4e80-4b2f-860d-065fac0cc574","Type":"ContainerDied","Data":"cddd592be79d0bc99f262ef1110c80020e5b6fc6c310030d7f1dc8d7a0c6cff8"} Mar 14 09:20:05 crc kubenswrapper[4956]: I0314 09:20:05.201775 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cddd592be79d0bc99f262ef1110c80020e5b6fc6c310030d7f1dc8d7a0c6cff8" Mar 14 09:20:05 crc kubenswrapper[4956]: I0314 09:20:05.201804 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558000-7z52v" Mar 14 09:20:05 crc kubenswrapper[4956]: I0314 09:20:05.741035 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557994-zvc72"] Mar 14 09:20:05 crc kubenswrapper[4956]: I0314 09:20:05.752895 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557994-zvc72"] Mar 14 09:20:06 crc kubenswrapper[4956]: I0314 09:20:06.213676 4956 generic.go:334] "Generic (PLEG): container finished" podID="f7d7957f-94f0-40d4-9e51-845d6014b789" containerID="4de93f3ad6c7fcd147d5e40a0827245b631faad7c691053c9e7b3d1b289a4758" exitCode=0 Mar 14 09:20:06 crc kubenswrapper[4956]: I0314 09:20:06.213724 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7hzj" event={"ID":"f7d7957f-94f0-40d4-9e51-845d6014b789","Type":"ContainerDied","Data":"4de93f3ad6c7fcd147d5e40a0827245b631faad7c691053c9e7b3d1b289a4758"} 
Mar 14 09:20:07 crc kubenswrapper[4956]: I0314 09:20:07.220563 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="414e50d4-0a21-488c-b82c-95e9efd119cb" path="/var/lib/kubelet/pods/414e50d4-0a21-488c-b82c-95e9efd119cb/volumes" Mar 14 09:20:07 crc kubenswrapper[4956]: I0314 09:20:07.224316 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7hzj" event={"ID":"f7d7957f-94f0-40d4-9e51-845d6014b789","Type":"ContainerStarted","Data":"64a23c18ea4634761992c109263b4934e543208daab85272ae54652e7a955ca8"} Mar 14 09:20:07 crc kubenswrapper[4956]: I0314 09:20:07.248499 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k7hzj" podStartSLOduration=2.77087492 podStartE2EDuration="7.248455846s" podCreationTimestamp="2026-03-14 09:20:00 +0000 UTC" firstStartedPulling="2026-03-14 09:20:02.170466161 +0000 UTC m=+1407.683158429" lastFinishedPulling="2026-03-14 09:20:06.648047087 +0000 UTC m=+1412.160739355" observedRunningTime="2026-03-14 09:20:07.242909978 +0000 UTC m=+1412.755602246" watchObservedRunningTime="2026-03-14 09:20:07.248455846 +0000 UTC m=+1412.761148124" Mar 14 09:20:07 crc kubenswrapper[4956]: I0314 09:20:07.751557 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:20:07 crc kubenswrapper[4956]: I0314 09:20:07.763344 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:20:07 crc kubenswrapper[4956]: I0314 09:20:07.767361 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:20:07 crc kubenswrapper[4956]: I0314 09:20:07.775647 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:20:07 crc 
kubenswrapper[4956]: I0314 09:20:07.783266 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:20:07 crc kubenswrapper[4956]: I0314 09:20:07.802395 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:20:08 crc kubenswrapper[4956]: I0314 09:20:08.231792 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:20:08 crc kubenswrapper[4956]: I0314 09:20:08.237058 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:20:08 crc kubenswrapper[4956]: I0314 09:20:08.257222 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:20:08 crc kubenswrapper[4956]: I0314 09:20:08.261526 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:20:10 crc kubenswrapper[4956]: I0314 09:20:10.840820 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:20:10 crc kubenswrapper[4956]: I0314 09:20:10.842295 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerName="ceilometer-central-agent" containerID="cri-o://96b111767c44fb930d0722254c48ef09c32fd36be28173dafc6e4d2660d3f6a6" gracePeriod=30 Mar 14 09:20:10 crc kubenswrapper[4956]: I0314 09:20:10.842353 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerName="proxy-httpd" containerID="cri-o://ee0f19568bbd227f40637bff374f3469b8cbf29bcd2642a239945e6454e64939" 
gracePeriod=30 Mar 14 09:20:10 crc kubenswrapper[4956]: I0314 09:20:10.842450 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerName="ceilometer-notification-agent" containerID="cri-o://92d92aaa9fae56c406d0a41f173a0f421d0b742d3b1c144bfb7037445f52d8ef" gracePeriod=30 Mar 14 09:20:10 crc kubenswrapper[4956]: I0314 09:20:10.842391 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerName="sg-core" containerID="cri-o://155cf0a7b6a3587aeb64da5910961603f8c25fd91bf9f46e063b1918128bd59e" gracePeriod=30 Mar 14 09:20:10 crc kubenswrapper[4956]: I0314 09:20:10.962548 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k7hzj" Mar 14 09:20:10 crc kubenswrapper[4956]: I0314 09:20:10.962598 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k7hzj" Mar 14 09:20:11 crc kubenswrapper[4956]: I0314 09:20:11.256430 4956 generic.go:334] "Generic (PLEG): container finished" podID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerID="ee0f19568bbd227f40637bff374f3469b8cbf29bcd2642a239945e6454e64939" exitCode=0 Mar 14 09:20:11 crc kubenswrapper[4956]: I0314 09:20:11.256462 4956 generic.go:334] "Generic (PLEG): container finished" podID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerID="155cf0a7b6a3587aeb64da5910961603f8c25fd91bf9f46e063b1918128bd59e" exitCode=2 Mar 14 09:20:11 crc kubenswrapper[4956]: I0314 09:20:11.256495 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b68eaedb-1389-45e7-b145-f8ad57fb4aca","Type":"ContainerDied","Data":"ee0f19568bbd227f40637bff374f3469b8cbf29bcd2642a239945e6454e64939"} Mar 14 09:20:11 crc kubenswrapper[4956]: I0314 09:20:11.256520 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b68eaedb-1389-45e7-b145-f8ad57fb4aca","Type":"ContainerDied","Data":"155cf0a7b6a3587aeb64da5910961603f8c25fd91bf9f46e063b1918128bd59e"} Mar 14 09:20:12 crc kubenswrapper[4956]: I0314 09:20:12.011880 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k7hzj" podUID="f7d7957f-94f0-40d4-9e51-845d6014b789" containerName="registry-server" probeResult="failure" output=< Mar 14 09:20:12 crc kubenswrapper[4956]: timeout: failed to connect service ":50051" within 1s Mar 14 09:20:12 crc kubenswrapper[4956]: > Mar 14 09:20:12 crc kubenswrapper[4956]: I0314 09:20:12.266730 4956 generic.go:334] "Generic (PLEG): container finished" podID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerID="96b111767c44fb930d0722254c48ef09c32fd36be28173dafc6e4d2660d3f6a6" exitCode=0 Mar 14 09:20:12 crc kubenswrapper[4956]: I0314 09:20:12.266839 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b68eaedb-1389-45e7-b145-f8ad57fb4aca","Type":"ContainerDied","Data":"96b111767c44fb930d0722254c48ef09c32fd36be28173dafc6e4d2660d3f6a6"} Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.037196 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.111690 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-combined-ca-bundle\") pod \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.111804 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b68eaedb-1389-45e7-b145-f8ad57fb4aca-log-httpd\") pod \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.111834 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b68eaedb-1389-45e7-b145-f8ad57fb4aca-run-httpd\") pod \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.111878 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-config-data\") pod \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.111950 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m949\" (UniqueName: \"kubernetes.io/projected/b68eaedb-1389-45e7-b145-f8ad57fb4aca-kube-api-access-7m949\") pod \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.111990 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-sg-core-conf-yaml\") pod \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.112016 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-scripts\") pod \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.112043 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-ceilometer-tls-certs\") pod \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\" (UID: \"b68eaedb-1389-45e7-b145-f8ad57fb4aca\") " Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.112930 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b68eaedb-1389-45e7-b145-f8ad57fb4aca-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b68eaedb-1389-45e7-b145-f8ad57fb4aca" (UID: "b68eaedb-1389-45e7-b145-f8ad57fb4aca"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.113148 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b68eaedb-1389-45e7-b145-f8ad57fb4aca-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b68eaedb-1389-45e7-b145-f8ad57fb4aca" (UID: "b68eaedb-1389-45e7-b145-f8ad57fb4aca"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.117056 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-scripts" (OuterVolumeSpecName: "scripts") pod "b68eaedb-1389-45e7-b145-f8ad57fb4aca" (UID: "b68eaedb-1389-45e7-b145-f8ad57fb4aca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.117104 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b68eaedb-1389-45e7-b145-f8ad57fb4aca-kube-api-access-7m949" (OuterVolumeSpecName: "kube-api-access-7m949") pod "b68eaedb-1389-45e7-b145-f8ad57fb4aca" (UID: "b68eaedb-1389-45e7-b145-f8ad57fb4aca"). InnerVolumeSpecName "kube-api-access-7m949". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.136195 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b68eaedb-1389-45e7-b145-f8ad57fb4aca" (UID: "b68eaedb-1389-45e7-b145-f8ad57fb4aca"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.156157 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b68eaedb-1389-45e7-b145-f8ad57fb4aca" (UID: "b68eaedb-1389-45e7-b145-f8ad57fb4aca"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.176460 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b68eaedb-1389-45e7-b145-f8ad57fb4aca" (UID: "b68eaedb-1389-45e7-b145-f8ad57fb4aca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.192748 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-config-data" (OuterVolumeSpecName: "config-data") pod "b68eaedb-1389-45e7-b145-f8ad57fb4aca" (UID: "b68eaedb-1389-45e7-b145-f8ad57fb4aca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.214838 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.214882 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m949\" (UniqueName: \"kubernetes.io/projected/b68eaedb-1389-45e7-b145-f8ad57fb4aca-kube-api-access-7m949\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.214896 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.214907 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 
09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.214920 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.214931 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b68eaedb-1389-45e7-b145-f8ad57fb4aca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.214942 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b68eaedb-1389-45e7-b145-f8ad57fb4aca-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.214952 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b68eaedb-1389-45e7-b145-f8ad57fb4aca-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.331515 4956 generic.go:334] "Generic (PLEG): container finished" podID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerID="92d92aaa9fae56c406d0a41f173a0f421d0b742d3b1c144bfb7037445f52d8ef" exitCode=0 Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.331570 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b68eaedb-1389-45e7-b145-f8ad57fb4aca","Type":"ContainerDied","Data":"92d92aaa9fae56c406d0a41f173a0f421d0b742d3b1c144bfb7037445f52d8ef"} Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.331608 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b68eaedb-1389-45e7-b145-f8ad57fb4aca","Type":"ContainerDied","Data":"32bf1d3af738cc20d14fcfa67288b3889cc3e695d5cf13129a2c42e3438b9b30"} Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.331632 4956 
scope.go:117] "RemoveContainer" containerID="ee0f19568bbd227f40637bff374f3469b8cbf29bcd2642a239945e6454e64939" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.333781 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.361223 4956 scope.go:117] "RemoveContainer" containerID="155cf0a7b6a3587aeb64da5910961603f8c25fd91bf9f46e063b1918128bd59e" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.372620 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.390515 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.400773 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:20:16 crc kubenswrapper[4956]: E0314 09:20:16.401281 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerName="proxy-httpd" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.401306 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerName="proxy-httpd" Mar 14 09:20:16 crc kubenswrapper[4956]: E0314 09:20:16.401320 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9098de-4e80-4b2f-860d-065fac0cc574" containerName="oc" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.401328 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9098de-4e80-4b2f-860d-065fac0cc574" containerName="oc" Mar 14 09:20:16 crc kubenswrapper[4956]: E0314 09:20:16.401345 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerName="ceilometer-central-agent" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.401351 4956 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerName="ceilometer-central-agent" Mar 14 09:20:16 crc kubenswrapper[4956]: E0314 09:20:16.401369 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerName="ceilometer-notification-agent" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.401375 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerName="ceilometer-notification-agent" Mar 14 09:20:16 crc kubenswrapper[4956]: E0314 09:20:16.401397 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerName="sg-core" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.401403 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerName="sg-core" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.401584 4956 scope.go:117] "RemoveContainer" containerID="92d92aaa9fae56c406d0a41f173a0f421d0b742d3b1c144bfb7037445f52d8ef" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.401593 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerName="ceilometer-central-agent" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.401760 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerName="proxy-httpd" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.401788 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerName="ceilometer-notification-agent" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.401809 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e9098de-4e80-4b2f-860d-065fac0cc574" containerName="oc" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.401843 
4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" containerName="sg-core" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.404089 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.407321 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.407605 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.408074 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.429084 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.437842 4956 scope.go:117] "RemoveContainer" containerID="96b111767c44fb930d0722254c48ef09c32fd36be28173dafc6e4d2660d3f6a6" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.455550 4956 scope.go:117] "RemoveContainer" containerID="ee0f19568bbd227f40637bff374f3469b8cbf29bcd2642a239945e6454e64939" Mar 14 09:20:16 crc kubenswrapper[4956]: E0314 09:20:16.455938 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee0f19568bbd227f40637bff374f3469b8cbf29bcd2642a239945e6454e64939\": container with ID starting with ee0f19568bbd227f40637bff374f3469b8cbf29bcd2642a239945e6454e64939 not found: ID does not exist" containerID="ee0f19568bbd227f40637bff374f3469b8cbf29bcd2642a239945e6454e64939" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.456111 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ee0f19568bbd227f40637bff374f3469b8cbf29bcd2642a239945e6454e64939"} err="failed to get container status \"ee0f19568bbd227f40637bff374f3469b8cbf29bcd2642a239945e6454e64939\": rpc error: code = NotFound desc = could not find container \"ee0f19568bbd227f40637bff374f3469b8cbf29bcd2642a239945e6454e64939\": container with ID starting with ee0f19568bbd227f40637bff374f3469b8cbf29bcd2642a239945e6454e64939 not found: ID does not exist" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.456144 4956 scope.go:117] "RemoveContainer" containerID="155cf0a7b6a3587aeb64da5910961603f8c25fd91bf9f46e063b1918128bd59e" Mar 14 09:20:16 crc kubenswrapper[4956]: E0314 09:20:16.457167 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"155cf0a7b6a3587aeb64da5910961603f8c25fd91bf9f46e063b1918128bd59e\": container with ID starting with 155cf0a7b6a3587aeb64da5910961603f8c25fd91bf9f46e063b1918128bd59e not found: ID does not exist" containerID="155cf0a7b6a3587aeb64da5910961603f8c25fd91bf9f46e063b1918128bd59e" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.457225 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"155cf0a7b6a3587aeb64da5910961603f8c25fd91bf9f46e063b1918128bd59e"} err="failed to get container status \"155cf0a7b6a3587aeb64da5910961603f8c25fd91bf9f46e063b1918128bd59e\": rpc error: code = NotFound desc = could not find container \"155cf0a7b6a3587aeb64da5910961603f8c25fd91bf9f46e063b1918128bd59e\": container with ID starting with 155cf0a7b6a3587aeb64da5910961603f8c25fd91bf9f46e063b1918128bd59e not found: ID does not exist" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.457254 4956 scope.go:117] "RemoveContainer" containerID="92d92aaa9fae56c406d0a41f173a0f421d0b742d3b1c144bfb7037445f52d8ef" Mar 14 09:20:16 crc kubenswrapper[4956]: E0314 09:20:16.457568 4956 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"92d92aaa9fae56c406d0a41f173a0f421d0b742d3b1c144bfb7037445f52d8ef\": container with ID starting with 92d92aaa9fae56c406d0a41f173a0f421d0b742d3b1c144bfb7037445f52d8ef not found: ID does not exist" containerID="92d92aaa9fae56c406d0a41f173a0f421d0b742d3b1c144bfb7037445f52d8ef" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.457593 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92d92aaa9fae56c406d0a41f173a0f421d0b742d3b1c144bfb7037445f52d8ef"} err="failed to get container status \"92d92aaa9fae56c406d0a41f173a0f421d0b742d3b1c144bfb7037445f52d8ef\": rpc error: code = NotFound desc = could not find container \"92d92aaa9fae56c406d0a41f173a0f421d0b742d3b1c144bfb7037445f52d8ef\": container with ID starting with 92d92aaa9fae56c406d0a41f173a0f421d0b742d3b1c144bfb7037445f52d8ef not found: ID does not exist" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.457606 4956 scope.go:117] "RemoveContainer" containerID="96b111767c44fb930d0722254c48ef09c32fd36be28173dafc6e4d2660d3f6a6" Mar 14 09:20:16 crc kubenswrapper[4956]: E0314 09:20:16.457863 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96b111767c44fb930d0722254c48ef09c32fd36be28173dafc6e4d2660d3f6a6\": container with ID starting with 96b111767c44fb930d0722254c48ef09c32fd36be28173dafc6e4d2660d3f6a6 not found: ID does not exist" containerID="96b111767c44fb930d0722254c48ef09c32fd36be28173dafc6e4d2660d3f6a6" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.457885 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96b111767c44fb930d0722254c48ef09c32fd36be28173dafc6e4d2660d3f6a6"} err="failed to get container status \"96b111767c44fb930d0722254c48ef09c32fd36be28173dafc6e4d2660d3f6a6\": rpc error: code = NotFound desc = could not find container 
\"96b111767c44fb930d0722254c48ef09c32fd36be28173dafc6e4d2660d3f6a6\": container with ID starting with 96b111767c44fb930d0722254c48ef09c32fd36be28173dafc6e4d2660d3f6a6 not found: ID does not exist" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.523612 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.523723 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216f8c56-27d3-42da-9559-d58862c4f84a-log-httpd\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.523769 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.523817 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-scripts\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.523863 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/216f8c56-27d3-42da-9559-d58862c4f84a-run-httpd\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.523924 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-config-data\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.524078 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.524338 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8cq5\" (UniqueName: \"kubernetes.io/projected/216f8c56-27d3-42da-9559-d58862c4f84a-kube-api-access-d8cq5\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.625771 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8cq5\" (UniqueName: \"kubernetes.io/projected/216f8c56-27d3-42da-9559-d58862c4f84a-kube-api-access-d8cq5\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.625861 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.625917 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216f8c56-27d3-42da-9559-d58862c4f84a-log-httpd\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.625934 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.625967 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-scripts\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.625997 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216f8c56-27d3-42da-9559-d58862c4f84a-run-httpd\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.626038 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-config-data\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " 
pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.626053 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.627875 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216f8c56-27d3-42da-9559-d58862c4f84a-log-httpd\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.627892 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216f8c56-27d3-42da-9559-d58862c4f84a-run-httpd\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.630981 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-scripts\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.632192 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.632718 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-config-data\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.632815 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.632937 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.648922 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8cq5\" (UniqueName: \"kubernetes.io/projected/216f8c56-27d3-42da-9559-d58862c4f84a-kube-api-access-d8cq5\") pod \"ceilometer-0\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:16 crc kubenswrapper[4956]: I0314 09:20:16.743243 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:17 crc kubenswrapper[4956]: I0314 09:20:17.207369 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:20:17 crc kubenswrapper[4956]: I0314 09:20:17.219819 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b68eaedb-1389-45e7-b145-f8ad57fb4aca" path="/var/lib/kubelet/pods/b68eaedb-1389-45e7-b145-f8ad57fb4aca/volumes" Mar 14 09:20:17 crc kubenswrapper[4956]: I0314 09:20:17.345039 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"216f8c56-27d3-42da-9559-d58862c4f84a","Type":"ContainerStarted","Data":"c890ef2d604b8552f9c50686fc70350b5e5ef4b34ea53f8eda4f516326bdef58"} Mar 14 09:20:18 crc kubenswrapper[4956]: I0314 09:20:18.353870 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"216f8c56-27d3-42da-9559-d58862c4f84a","Type":"ContainerStarted","Data":"a0f8aea5278cb520a8596eae7eae982cd216c038cc00ecc5ddcd3e35fa2a3c86"} Mar 14 09:20:19 crc kubenswrapper[4956]: I0314 09:20:19.364495 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"216f8c56-27d3-42da-9559-d58862c4f84a","Type":"ContainerStarted","Data":"aba7acd55c2c4b35fb09bbd85eaf66602b8293d09926544e38c03ab64a66c9e7"} Mar 14 09:20:20 crc kubenswrapper[4956]: I0314 09:20:20.372806 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"216f8c56-27d3-42da-9559-d58862c4f84a","Type":"ContainerStarted","Data":"09aeb4bfbce8f0d48ce429c9cc00a5331e7d4a59df31b198b658c3f4eaa4cd69"} Mar 14 09:20:21 crc kubenswrapper[4956]: I0314 09:20:21.011113 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k7hzj" Mar 14 09:20:21 crc kubenswrapper[4956]: I0314 09:20:21.055041 4956 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k7hzj" Mar 14 09:20:22 crc kubenswrapper[4956]: I0314 09:20:22.412135 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"216f8c56-27d3-42da-9559-d58862c4f84a","Type":"ContainerStarted","Data":"82b1c08c4310908e157d7b46968791121fb4a4f7244aba0bfe85914f96aeeed9"} Mar 14 09:20:22 crc kubenswrapper[4956]: I0314 09:20:22.413189 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:22 crc kubenswrapper[4956]: I0314 09:20:22.441834 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.407531473 podStartE2EDuration="6.441812726s" podCreationTimestamp="2026-03-14 09:20:16 +0000 UTC" firstStartedPulling="2026-03-14 09:20:17.213493432 +0000 UTC m=+1422.726185700" lastFinishedPulling="2026-03-14 09:20:21.247774675 +0000 UTC m=+1426.760466953" observedRunningTime="2026-03-14 09:20:22.436688688 +0000 UTC m=+1427.949380956" watchObservedRunningTime="2026-03-14 09:20:22.441812726 +0000 UTC m=+1427.954504994" Mar 14 09:20:24 crc kubenswrapper[4956]: I0314 09:20:24.805646 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k7hzj"] Mar 14 09:20:24 crc kubenswrapper[4956]: I0314 09:20:24.806227 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k7hzj" podUID="f7d7957f-94f0-40d4-9e51-845d6014b789" containerName="registry-server" containerID="cri-o://64a23c18ea4634761992c109263b4934e543208daab85272ae54652e7a955ca8" gracePeriod=2 Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.299911 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k7hzj" Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.373857 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58w96\" (UniqueName: \"kubernetes.io/projected/f7d7957f-94f0-40d4-9e51-845d6014b789-kube-api-access-58w96\") pod \"f7d7957f-94f0-40d4-9e51-845d6014b789\" (UID: \"f7d7957f-94f0-40d4-9e51-845d6014b789\") " Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.374014 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d7957f-94f0-40d4-9e51-845d6014b789-catalog-content\") pod \"f7d7957f-94f0-40d4-9e51-845d6014b789\" (UID: \"f7d7957f-94f0-40d4-9e51-845d6014b789\") " Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.374093 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d7957f-94f0-40d4-9e51-845d6014b789-utilities\") pod \"f7d7957f-94f0-40d4-9e51-845d6014b789\" (UID: \"f7d7957f-94f0-40d4-9e51-845d6014b789\") " Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.374937 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7d7957f-94f0-40d4-9e51-845d6014b789-utilities" (OuterVolumeSpecName: "utilities") pod "f7d7957f-94f0-40d4-9e51-845d6014b789" (UID: "f7d7957f-94f0-40d4-9e51-845d6014b789"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.385763 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7d7957f-94f0-40d4-9e51-845d6014b789-kube-api-access-58w96" (OuterVolumeSpecName: "kube-api-access-58w96") pod "f7d7957f-94f0-40d4-9e51-845d6014b789" (UID: "f7d7957f-94f0-40d4-9e51-845d6014b789"). InnerVolumeSpecName "kube-api-access-58w96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.434631 4956 generic.go:334] "Generic (PLEG): container finished" podID="f7d7957f-94f0-40d4-9e51-845d6014b789" containerID="64a23c18ea4634761992c109263b4934e543208daab85272ae54652e7a955ca8" exitCode=0 Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.434896 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7hzj" event={"ID":"f7d7957f-94f0-40d4-9e51-845d6014b789","Type":"ContainerDied","Data":"64a23c18ea4634761992c109263b4934e543208daab85272ae54652e7a955ca8"} Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.434980 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7hzj" event={"ID":"f7d7957f-94f0-40d4-9e51-845d6014b789","Type":"ContainerDied","Data":"ab783c9318928a1ee23cd772fe5a31ce3941efe12713f1b332cd3ce8d0847279"} Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.435049 4956 scope.go:117] "RemoveContainer" containerID="64a23c18ea4634761992c109263b4934e543208daab85272ae54652e7a955ca8" Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.435228 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k7hzj" Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.457806 4956 scope.go:117] "RemoveContainer" containerID="4de93f3ad6c7fcd147d5e40a0827245b631faad7c691053c9e7b3d1b289a4758" Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.476072 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58w96\" (UniqueName: \"kubernetes.io/projected/f7d7957f-94f0-40d4-9e51-845d6014b789-kube-api-access-58w96\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.476108 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d7957f-94f0-40d4-9e51-845d6014b789-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.494894 4956 scope.go:117] "RemoveContainer" containerID="14697ac81ed430507d447b0048f383a8f1c974100dfa7e941461cc8637d2ef58" Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.502924 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7d7957f-94f0-40d4-9e51-845d6014b789-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7d7957f-94f0-40d4-9e51-845d6014b789" (UID: "f7d7957f-94f0-40d4-9e51-845d6014b789"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.513800 4956 scope.go:117] "RemoveContainer" containerID="64a23c18ea4634761992c109263b4934e543208daab85272ae54652e7a955ca8" Mar 14 09:20:25 crc kubenswrapper[4956]: E0314 09:20:25.514204 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64a23c18ea4634761992c109263b4934e543208daab85272ae54652e7a955ca8\": container with ID starting with 64a23c18ea4634761992c109263b4934e543208daab85272ae54652e7a955ca8 not found: ID does not exist" containerID="64a23c18ea4634761992c109263b4934e543208daab85272ae54652e7a955ca8" Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.514253 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64a23c18ea4634761992c109263b4934e543208daab85272ae54652e7a955ca8"} err="failed to get container status \"64a23c18ea4634761992c109263b4934e543208daab85272ae54652e7a955ca8\": rpc error: code = NotFound desc = could not find container \"64a23c18ea4634761992c109263b4934e543208daab85272ae54652e7a955ca8\": container with ID starting with 64a23c18ea4634761992c109263b4934e543208daab85272ae54652e7a955ca8 not found: ID does not exist" Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.514282 4956 scope.go:117] "RemoveContainer" containerID="4de93f3ad6c7fcd147d5e40a0827245b631faad7c691053c9e7b3d1b289a4758" Mar 14 09:20:25 crc kubenswrapper[4956]: E0314 09:20:25.517004 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4de93f3ad6c7fcd147d5e40a0827245b631faad7c691053c9e7b3d1b289a4758\": container with ID starting with 4de93f3ad6c7fcd147d5e40a0827245b631faad7c691053c9e7b3d1b289a4758 not found: ID does not exist" containerID="4de93f3ad6c7fcd147d5e40a0827245b631faad7c691053c9e7b3d1b289a4758" Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.517033 
4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4de93f3ad6c7fcd147d5e40a0827245b631faad7c691053c9e7b3d1b289a4758"} err="failed to get container status \"4de93f3ad6c7fcd147d5e40a0827245b631faad7c691053c9e7b3d1b289a4758\": rpc error: code = NotFound desc = could not find container \"4de93f3ad6c7fcd147d5e40a0827245b631faad7c691053c9e7b3d1b289a4758\": container with ID starting with 4de93f3ad6c7fcd147d5e40a0827245b631faad7c691053c9e7b3d1b289a4758 not found: ID does not exist" Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.517054 4956 scope.go:117] "RemoveContainer" containerID="14697ac81ed430507d447b0048f383a8f1c974100dfa7e941461cc8637d2ef58" Mar 14 09:20:25 crc kubenswrapper[4956]: E0314 09:20:25.517482 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14697ac81ed430507d447b0048f383a8f1c974100dfa7e941461cc8637d2ef58\": container with ID starting with 14697ac81ed430507d447b0048f383a8f1c974100dfa7e941461cc8637d2ef58 not found: ID does not exist" containerID="14697ac81ed430507d447b0048f383a8f1c974100dfa7e941461cc8637d2ef58" Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.517529 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14697ac81ed430507d447b0048f383a8f1c974100dfa7e941461cc8637d2ef58"} err="failed to get container status \"14697ac81ed430507d447b0048f383a8f1c974100dfa7e941461cc8637d2ef58\": rpc error: code = NotFound desc = could not find container \"14697ac81ed430507d447b0048f383a8f1c974100dfa7e941461cc8637d2ef58\": container with ID starting with 14697ac81ed430507d447b0048f383a8f1c974100dfa7e941461cc8637d2ef58 not found: ID does not exist" Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.577400 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f7d7957f-94f0-40d4-9e51-845d6014b789-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.765248 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k7hzj"] Mar 14 09:20:25 crc kubenswrapper[4956]: I0314 09:20:25.771254 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k7hzj"] Mar 14 09:20:27 crc kubenswrapper[4956]: I0314 09:20:27.222462 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7d7957f-94f0-40d4-9e51-845d6014b789" path="/var/lib/kubelet/pods/f7d7957f-94f0-40d4-9e51-845d6014b789/volumes" Mar 14 09:20:46 crc kubenswrapper[4956]: I0314 09:20:46.751154 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:20:54 crc kubenswrapper[4956]: I0314 09:20:54.092601 4956 scope.go:117] "RemoveContainer" containerID="4e2c2598dc3408c30bd4ddb0a6cf96b5e84167b0b935a8e931ac0c0d3f706d36" Mar 14 09:21:55 crc kubenswrapper[4956]: I0314 09:21:55.423790 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:21:55 crc kubenswrapper[4956]: I0314 09:21:55.424502 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:22:00 crc kubenswrapper[4956]: I0314 09:22:00.148019 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558002-9zjfh"] Mar 14 09:22:00 crc 
kubenswrapper[4956]: E0314 09:22:00.149082 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d7957f-94f0-40d4-9e51-845d6014b789" containerName="extract-utilities" Mar 14 09:22:00 crc kubenswrapper[4956]: I0314 09:22:00.149096 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d7957f-94f0-40d4-9e51-845d6014b789" containerName="extract-utilities" Mar 14 09:22:00 crc kubenswrapper[4956]: E0314 09:22:00.149120 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d7957f-94f0-40d4-9e51-845d6014b789" containerName="registry-server" Mar 14 09:22:00 crc kubenswrapper[4956]: I0314 09:22:00.149131 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d7957f-94f0-40d4-9e51-845d6014b789" containerName="registry-server" Mar 14 09:22:00 crc kubenswrapper[4956]: E0314 09:22:00.149154 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d7957f-94f0-40d4-9e51-845d6014b789" containerName="extract-content" Mar 14 09:22:00 crc kubenswrapper[4956]: I0314 09:22:00.149160 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d7957f-94f0-40d4-9e51-845d6014b789" containerName="extract-content" Mar 14 09:22:00 crc kubenswrapper[4956]: I0314 09:22:00.149323 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7d7957f-94f0-40d4-9e51-845d6014b789" containerName="registry-server" Mar 14 09:22:00 crc kubenswrapper[4956]: I0314 09:22:00.150027 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558002-9zjfh" Mar 14 09:22:00 crc kubenswrapper[4956]: I0314 09:22:00.159761 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:22:00 crc kubenswrapper[4956]: I0314 09:22:00.159918 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:22:00 crc kubenswrapper[4956]: I0314 09:22:00.159987 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:22:00 crc kubenswrapper[4956]: I0314 09:22:00.164894 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558002-9zjfh"] Mar 14 09:22:00 crc kubenswrapper[4956]: I0314 09:22:00.254174 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flmjn\" (UniqueName: \"kubernetes.io/projected/b915a811-e123-4d41-a773-f1610198bb90-kube-api-access-flmjn\") pod \"auto-csr-approver-29558002-9zjfh\" (UID: \"b915a811-e123-4d41-a773-f1610198bb90\") " pod="openshift-infra/auto-csr-approver-29558002-9zjfh" Mar 14 09:22:00 crc kubenswrapper[4956]: I0314 09:22:00.356224 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flmjn\" (UniqueName: \"kubernetes.io/projected/b915a811-e123-4d41-a773-f1610198bb90-kube-api-access-flmjn\") pod \"auto-csr-approver-29558002-9zjfh\" (UID: \"b915a811-e123-4d41-a773-f1610198bb90\") " pod="openshift-infra/auto-csr-approver-29558002-9zjfh" Mar 14 09:22:00 crc kubenswrapper[4956]: I0314 09:22:00.376973 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flmjn\" (UniqueName: \"kubernetes.io/projected/b915a811-e123-4d41-a773-f1610198bb90-kube-api-access-flmjn\") pod \"auto-csr-approver-29558002-9zjfh\" (UID: \"b915a811-e123-4d41-a773-f1610198bb90\") " 
pod="openshift-infra/auto-csr-approver-29558002-9zjfh" Mar 14 09:22:00 crc kubenswrapper[4956]: I0314 09:22:00.475851 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558002-9zjfh" Mar 14 09:22:00 crc kubenswrapper[4956]: I0314 09:22:00.944349 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558002-9zjfh"] Mar 14 09:22:00 crc kubenswrapper[4956]: I0314 09:22:00.945252 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:22:01 crc kubenswrapper[4956]: I0314 09:22:01.712084 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558002-9zjfh" event={"ID":"b915a811-e123-4d41-a773-f1610198bb90","Type":"ContainerStarted","Data":"ca6daa957804bf77fc81df213398834adfefa8401e5926e10392e2e6f35f32cf"} Mar 14 09:22:02 crc kubenswrapper[4956]: I0314 09:22:02.724295 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558002-9zjfh" event={"ID":"b915a811-e123-4d41-a773-f1610198bb90","Type":"ContainerStarted","Data":"f3950bca3977a4c804cecde459e8d39654c6e790ad12f6dcaa7d9545ae4b2406"} Mar 14 09:22:02 crc kubenswrapper[4956]: I0314 09:22:02.745101 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558002-9zjfh" podStartSLOduration=1.464468429 podStartE2EDuration="2.745074557s" podCreationTimestamp="2026-03-14 09:22:00 +0000 UTC" firstStartedPulling="2026-03-14 09:22:00.944970664 +0000 UTC m=+1526.457662932" lastFinishedPulling="2026-03-14 09:22:02.225576792 +0000 UTC m=+1527.738269060" observedRunningTime="2026-03-14 09:22:02.737758184 +0000 UTC m=+1528.250450462" watchObservedRunningTime="2026-03-14 09:22:02.745074557 +0000 UTC m=+1528.257766825" Mar 14 09:22:03 crc kubenswrapper[4956]: I0314 09:22:03.834644 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="b915a811-e123-4d41-a773-f1610198bb90" containerID="f3950bca3977a4c804cecde459e8d39654c6e790ad12f6dcaa7d9545ae4b2406" exitCode=0 Mar 14 09:22:03 crc kubenswrapper[4956]: I0314 09:22:03.834775 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558002-9zjfh" event={"ID":"b915a811-e123-4d41-a773-f1610198bb90","Type":"ContainerDied","Data":"f3950bca3977a4c804cecde459e8d39654c6e790ad12f6dcaa7d9545ae4b2406"} Mar 14 09:22:05 crc kubenswrapper[4956]: I0314 09:22:05.148952 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558002-9zjfh" Mar 14 09:22:05 crc kubenswrapper[4956]: I0314 09:22:05.235823 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flmjn\" (UniqueName: \"kubernetes.io/projected/b915a811-e123-4d41-a773-f1610198bb90-kube-api-access-flmjn\") pod \"b915a811-e123-4d41-a773-f1610198bb90\" (UID: \"b915a811-e123-4d41-a773-f1610198bb90\") " Mar 14 09:22:05 crc kubenswrapper[4956]: I0314 09:22:05.240940 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b915a811-e123-4d41-a773-f1610198bb90-kube-api-access-flmjn" (OuterVolumeSpecName: "kube-api-access-flmjn") pod "b915a811-e123-4d41-a773-f1610198bb90" (UID: "b915a811-e123-4d41-a773-f1610198bb90"). InnerVolumeSpecName "kube-api-access-flmjn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:22:05 crc kubenswrapper[4956]: I0314 09:22:05.338207 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flmjn\" (UniqueName: \"kubernetes.io/projected/b915a811-e123-4d41-a773-f1610198bb90-kube-api-access-flmjn\") on node \"crc\" DevicePath \"\"" Mar 14 09:22:05 crc kubenswrapper[4956]: I0314 09:22:05.817565 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557996-wjwrm"] Mar 14 09:22:05 crc kubenswrapper[4956]: I0314 09:22:05.825742 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557996-wjwrm"] Mar 14 09:22:05 crc kubenswrapper[4956]: I0314 09:22:05.852241 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558002-9zjfh" event={"ID":"b915a811-e123-4d41-a773-f1610198bb90","Type":"ContainerDied","Data":"ca6daa957804bf77fc81df213398834adfefa8401e5926e10392e2e6f35f32cf"} Mar 14 09:22:05 crc kubenswrapper[4956]: I0314 09:22:05.852585 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca6daa957804bf77fc81df213398834adfefa8401e5926e10392e2e6f35f32cf" Mar 14 09:22:05 crc kubenswrapper[4956]: I0314 09:22:05.852319 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558002-9zjfh" Mar 14 09:22:07 crc kubenswrapper[4956]: I0314 09:22:07.223898 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3305c86-b3b7-4e20-8b6c-8cb514addf0e" path="/var/lib/kubelet/pods/f3305c86-b3b7-4e20-8b6c-8cb514addf0e/volumes" Mar 14 09:22:25 crc kubenswrapper[4956]: I0314 09:22:25.424191 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:22:25 crc kubenswrapper[4956]: I0314 09:22:25.424844 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:22:52 crc kubenswrapper[4956]: I0314 09:22:52.015240 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4n2nb"] Mar 14 09:22:52 crc kubenswrapper[4956]: E0314 09:22:52.016127 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b915a811-e123-4d41-a773-f1610198bb90" containerName="oc" Mar 14 09:22:52 crc kubenswrapper[4956]: I0314 09:22:52.016140 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b915a811-e123-4d41-a773-f1610198bb90" containerName="oc" Mar 14 09:22:52 crc kubenswrapper[4956]: I0314 09:22:52.016344 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b915a811-e123-4d41-a773-f1610198bb90" containerName="oc" Mar 14 09:22:52 crc kubenswrapper[4956]: I0314 09:22:52.017559 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4n2nb" Mar 14 09:22:52 crc kubenswrapper[4956]: I0314 09:22:52.026728 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4n2nb"] Mar 14 09:22:52 crc kubenswrapper[4956]: I0314 09:22:52.182778 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkmrm\" (UniqueName: \"kubernetes.io/projected/5be1551e-d8a2-4dda-8b7b-c75433011d34-kube-api-access-mkmrm\") pod \"redhat-marketplace-4n2nb\" (UID: \"5be1551e-d8a2-4dda-8b7b-c75433011d34\") " pod="openshift-marketplace/redhat-marketplace-4n2nb" Mar 14 09:22:52 crc kubenswrapper[4956]: I0314 09:22:52.182918 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be1551e-d8a2-4dda-8b7b-c75433011d34-utilities\") pod \"redhat-marketplace-4n2nb\" (UID: \"5be1551e-d8a2-4dda-8b7b-c75433011d34\") " pod="openshift-marketplace/redhat-marketplace-4n2nb" Mar 14 09:22:52 crc kubenswrapper[4956]: I0314 09:22:52.183023 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be1551e-d8a2-4dda-8b7b-c75433011d34-catalog-content\") pod \"redhat-marketplace-4n2nb\" (UID: \"5be1551e-d8a2-4dda-8b7b-c75433011d34\") " pod="openshift-marketplace/redhat-marketplace-4n2nb" Mar 14 09:22:52 crc kubenswrapper[4956]: I0314 09:22:52.284231 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be1551e-d8a2-4dda-8b7b-c75433011d34-utilities\") pod \"redhat-marketplace-4n2nb\" (UID: \"5be1551e-d8a2-4dda-8b7b-c75433011d34\") " pod="openshift-marketplace/redhat-marketplace-4n2nb" Mar 14 09:22:52 crc kubenswrapper[4956]: I0314 09:22:52.284327 4956 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be1551e-d8a2-4dda-8b7b-c75433011d34-catalog-content\") pod \"redhat-marketplace-4n2nb\" (UID: \"5be1551e-d8a2-4dda-8b7b-c75433011d34\") " pod="openshift-marketplace/redhat-marketplace-4n2nb" Mar 14 09:22:52 crc kubenswrapper[4956]: I0314 09:22:52.284411 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkmrm\" (UniqueName: \"kubernetes.io/projected/5be1551e-d8a2-4dda-8b7b-c75433011d34-kube-api-access-mkmrm\") pod \"redhat-marketplace-4n2nb\" (UID: \"5be1551e-d8a2-4dda-8b7b-c75433011d34\") " pod="openshift-marketplace/redhat-marketplace-4n2nb" Mar 14 09:22:52 crc kubenswrapper[4956]: I0314 09:22:52.284884 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be1551e-d8a2-4dda-8b7b-c75433011d34-utilities\") pod \"redhat-marketplace-4n2nb\" (UID: \"5be1551e-d8a2-4dda-8b7b-c75433011d34\") " pod="openshift-marketplace/redhat-marketplace-4n2nb" Mar 14 09:22:52 crc kubenswrapper[4956]: I0314 09:22:52.284939 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be1551e-d8a2-4dda-8b7b-c75433011d34-catalog-content\") pod \"redhat-marketplace-4n2nb\" (UID: \"5be1551e-d8a2-4dda-8b7b-c75433011d34\") " pod="openshift-marketplace/redhat-marketplace-4n2nb" Mar 14 09:22:52 crc kubenswrapper[4956]: I0314 09:22:52.304111 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkmrm\" (UniqueName: \"kubernetes.io/projected/5be1551e-d8a2-4dda-8b7b-c75433011d34-kube-api-access-mkmrm\") pod \"redhat-marketplace-4n2nb\" (UID: \"5be1551e-d8a2-4dda-8b7b-c75433011d34\") " pod="openshift-marketplace/redhat-marketplace-4n2nb" Mar 14 09:22:52 crc kubenswrapper[4956]: I0314 09:22:52.336016 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4n2nb" Mar 14 09:22:52 crc kubenswrapper[4956]: I0314 09:22:52.790407 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4n2nb"] Mar 14 09:22:53 crc kubenswrapper[4956]: I0314 09:22:53.220369 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n2nb" event={"ID":"5be1551e-d8a2-4dda-8b7b-c75433011d34","Type":"ContainerStarted","Data":"24f5bee3b41653c1f1222c5919f07dc95119a6f8ca19f8dbb534f806b2f56968"} Mar 14 09:22:54 crc kubenswrapper[4956]: I0314 09:22:54.214558 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w4t6l"] Mar 14 09:22:54 crc kubenswrapper[4956]: I0314 09:22:54.216766 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w4t6l" Mar 14 09:22:54 crc kubenswrapper[4956]: I0314 09:22:54.230914 4956 generic.go:334] "Generic (PLEG): container finished" podID="5be1551e-d8a2-4dda-8b7b-c75433011d34" containerID="05080a6e8e843e38dc819147ea7415194864da09811af823b1139522ba4f3c68" exitCode=0 Mar 14 09:22:54 crc kubenswrapper[4956]: I0314 09:22:54.230968 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n2nb" event={"ID":"5be1551e-d8a2-4dda-8b7b-c75433011d34","Type":"ContainerDied","Data":"05080a6e8e843e38dc819147ea7415194864da09811af823b1139522ba4f3c68"} Mar 14 09:22:54 crc kubenswrapper[4956]: I0314 09:22:54.237024 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w4t6l"] Mar 14 09:22:54 crc kubenswrapper[4956]: I0314 09:22:54.238184 4956 scope.go:117] "RemoveContainer" containerID="f1269729d86b8766b2385e57e3d5b58b97745a83bc93d518d4e50c55ad709f1a" Mar 14 09:22:54 crc kubenswrapper[4956]: I0314 09:22:54.317382 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-catalog-content\") pod \"community-operators-w4t6l\" (UID: \"42a7bea9-8855-43a0-b8b1-f7354c53ac7b\") " pod="openshift-marketplace/community-operators-w4t6l" Mar 14 09:22:54 crc kubenswrapper[4956]: I0314 09:22:54.317449 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jnwv\" (UniqueName: \"kubernetes.io/projected/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-kube-api-access-9jnwv\") pod \"community-operators-w4t6l\" (UID: \"42a7bea9-8855-43a0-b8b1-f7354c53ac7b\") " pod="openshift-marketplace/community-operators-w4t6l" Mar 14 09:22:54 crc kubenswrapper[4956]: I0314 09:22:54.317659 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-utilities\") pod \"community-operators-w4t6l\" (UID: \"42a7bea9-8855-43a0-b8b1-f7354c53ac7b\") " pod="openshift-marketplace/community-operators-w4t6l" Mar 14 09:22:54 crc kubenswrapper[4956]: I0314 09:22:54.419439 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-utilities\") pod \"community-operators-w4t6l\" (UID: \"42a7bea9-8855-43a0-b8b1-f7354c53ac7b\") " pod="openshift-marketplace/community-operators-w4t6l" Mar 14 09:22:54 crc kubenswrapper[4956]: I0314 09:22:54.419607 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-catalog-content\") pod \"community-operators-w4t6l\" (UID: \"42a7bea9-8855-43a0-b8b1-f7354c53ac7b\") " pod="openshift-marketplace/community-operators-w4t6l" Mar 14 09:22:54 crc kubenswrapper[4956]: I0314 09:22:54.419632 
4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jnwv\" (UniqueName: \"kubernetes.io/projected/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-kube-api-access-9jnwv\") pod \"community-operators-w4t6l\" (UID: \"42a7bea9-8855-43a0-b8b1-f7354c53ac7b\") " pod="openshift-marketplace/community-operators-w4t6l" Mar 14 09:22:54 crc kubenswrapper[4956]: I0314 09:22:54.420033 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-utilities\") pod \"community-operators-w4t6l\" (UID: \"42a7bea9-8855-43a0-b8b1-f7354c53ac7b\") " pod="openshift-marketplace/community-operators-w4t6l" Mar 14 09:22:54 crc kubenswrapper[4956]: I0314 09:22:54.420178 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-catalog-content\") pod \"community-operators-w4t6l\" (UID: \"42a7bea9-8855-43a0-b8b1-f7354c53ac7b\") " pod="openshift-marketplace/community-operators-w4t6l" Mar 14 09:22:54 crc kubenswrapper[4956]: I0314 09:22:54.441441 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jnwv\" (UniqueName: \"kubernetes.io/projected/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-kube-api-access-9jnwv\") pod \"community-operators-w4t6l\" (UID: \"42a7bea9-8855-43a0-b8b1-f7354c53ac7b\") " pod="openshift-marketplace/community-operators-w4t6l" Mar 14 09:22:54 crc kubenswrapper[4956]: I0314 09:22:54.548387 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w4t6l" Mar 14 09:22:55 crc kubenswrapper[4956]: I0314 09:22:55.033176 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w4t6l"] Mar 14 09:22:55 crc kubenswrapper[4956]: W0314 09:22:55.034863 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42a7bea9_8855_43a0_b8b1_f7354c53ac7b.slice/crio-cd7449cca255672339aa0fd0a1eac1fe4b7f43b85705be8f17a2cfc19c86b432 WatchSource:0}: Error finding container cd7449cca255672339aa0fd0a1eac1fe4b7f43b85705be8f17a2cfc19c86b432: Status 404 returned error can't find the container with id cd7449cca255672339aa0fd0a1eac1fe4b7f43b85705be8f17a2cfc19c86b432 Mar 14 09:22:55 crc kubenswrapper[4956]: I0314 09:22:55.238728 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4t6l" event={"ID":"42a7bea9-8855-43a0-b8b1-f7354c53ac7b","Type":"ContainerStarted","Data":"cd7449cca255672339aa0fd0a1eac1fe4b7f43b85705be8f17a2cfc19c86b432"} Mar 14 09:22:55 crc kubenswrapper[4956]: I0314 09:22:55.423367 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:22:55 crc kubenswrapper[4956]: I0314 09:22:55.423422 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:22:55 crc kubenswrapper[4956]: I0314 09:22:55.423465 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 09:22:55 crc kubenswrapper[4956]: I0314 09:22:55.424280 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb"} pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:22:55 crc kubenswrapper[4956]: I0314 09:22:55.424362 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" containerID="cri-o://4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" gracePeriod=600 Mar 14 09:22:56 crc kubenswrapper[4956]: E0314 09:22:56.213643 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:22:56 crc kubenswrapper[4956]: I0314 09:22:56.248259 4956 generic.go:334] "Generic (PLEG): container finished" podID="42a7bea9-8855-43a0-b8b1-f7354c53ac7b" containerID="8ff0bcf41ea10277a9b293a93aed8b45c0e4e25fe2f11ebac891cda612c6b08d" exitCode=0 Mar 14 09:22:56 crc kubenswrapper[4956]: I0314 09:22:56.248562 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4t6l" event={"ID":"42a7bea9-8855-43a0-b8b1-f7354c53ac7b","Type":"ContainerDied","Data":"8ff0bcf41ea10277a9b293a93aed8b45c0e4e25fe2f11ebac891cda612c6b08d"} Mar 14 09:22:56 
crc kubenswrapper[4956]: I0314 09:22:56.251408 4956 generic.go:334] "Generic (PLEG): container finished" podID="9ba20367-e506-422e-a846-eb1525cb3b94" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" exitCode=0 Mar 14 09:22:56 crc kubenswrapper[4956]: I0314 09:22:56.251458 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerDied","Data":"4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb"} Mar 14 09:22:56 crc kubenswrapper[4956]: I0314 09:22:56.251600 4956 scope.go:117] "RemoveContainer" containerID="d51ed46fef8fdbd34e0f9aab241d72045408a6fd0c1f2415549d37d1cae23089" Mar 14 09:22:56 crc kubenswrapper[4956]: I0314 09:22:56.252182 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:22:56 crc kubenswrapper[4956]: E0314 09:22:56.252419 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:22:57 crc kubenswrapper[4956]: I0314 09:22:57.264451 4956 generic.go:334] "Generic (PLEG): container finished" podID="5be1551e-d8a2-4dda-8b7b-c75433011d34" containerID="5edc1be9f7e3c9ac8e95db305f6f7632f419f641816dfcb4fda77699741f2cb7" exitCode=0 Mar 14 09:22:57 crc kubenswrapper[4956]: I0314 09:22:57.264518 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n2nb" 
event={"ID":"5be1551e-d8a2-4dda-8b7b-c75433011d34","Type":"ContainerDied","Data":"5edc1be9f7e3c9ac8e95db305f6f7632f419f641816dfcb4fda77699741f2cb7"} Mar 14 09:22:59 crc kubenswrapper[4956]: I0314 09:22:59.016985 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gnvsv"] Mar 14 09:22:59 crc kubenswrapper[4956]: I0314 09:22:59.019509 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gnvsv" Mar 14 09:22:59 crc kubenswrapper[4956]: I0314 09:22:59.031334 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gnvsv"] Mar 14 09:22:59 crc kubenswrapper[4956]: I0314 09:22:59.211021 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577815e3-7633-4eb6-8725-5de6ee7cafc0-catalog-content\") pod \"certified-operators-gnvsv\" (UID: \"577815e3-7633-4eb6-8725-5de6ee7cafc0\") " pod="openshift-marketplace/certified-operators-gnvsv" Mar 14 09:22:59 crc kubenswrapper[4956]: I0314 09:22:59.211139 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577815e3-7633-4eb6-8725-5de6ee7cafc0-utilities\") pod \"certified-operators-gnvsv\" (UID: \"577815e3-7633-4eb6-8725-5de6ee7cafc0\") " pod="openshift-marketplace/certified-operators-gnvsv" Mar 14 09:22:59 crc kubenswrapper[4956]: I0314 09:22:59.211186 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t28gm\" (UniqueName: \"kubernetes.io/projected/577815e3-7633-4eb6-8725-5de6ee7cafc0-kube-api-access-t28gm\") pod \"certified-operators-gnvsv\" (UID: \"577815e3-7633-4eb6-8725-5de6ee7cafc0\") " pod="openshift-marketplace/certified-operators-gnvsv" Mar 14 09:22:59 crc kubenswrapper[4956]: I0314 09:22:59.312540 
4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577815e3-7633-4eb6-8725-5de6ee7cafc0-utilities\") pod \"certified-operators-gnvsv\" (UID: \"577815e3-7633-4eb6-8725-5de6ee7cafc0\") " pod="openshift-marketplace/certified-operators-gnvsv" Mar 14 09:22:59 crc kubenswrapper[4956]: I0314 09:22:59.313053 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577815e3-7633-4eb6-8725-5de6ee7cafc0-utilities\") pod \"certified-operators-gnvsv\" (UID: \"577815e3-7633-4eb6-8725-5de6ee7cafc0\") " pod="openshift-marketplace/certified-operators-gnvsv" Mar 14 09:22:59 crc kubenswrapper[4956]: I0314 09:22:59.313222 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t28gm\" (UniqueName: \"kubernetes.io/projected/577815e3-7633-4eb6-8725-5de6ee7cafc0-kube-api-access-t28gm\") pod \"certified-operators-gnvsv\" (UID: \"577815e3-7633-4eb6-8725-5de6ee7cafc0\") " pod="openshift-marketplace/certified-operators-gnvsv" Mar 14 09:22:59 crc kubenswrapper[4956]: I0314 09:22:59.313329 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577815e3-7633-4eb6-8725-5de6ee7cafc0-catalog-content\") pod \"certified-operators-gnvsv\" (UID: \"577815e3-7633-4eb6-8725-5de6ee7cafc0\") " pod="openshift-marketplace/certified-operators-gnvsv" Mar 14 09:22:59 crc kubenswrapper[4956]: I0314 09:22:59.313643 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577815e3-7633-4eb6-8725-5de6ee7cafc0-catalog-content\") pod \"certified-operators-gnvsv\" (UID: \"577815e3-7633-4eb6-8725-5de6ee7cafc0\") " pod="openshift-marketplace/certified-operators-gnvsv" Mar 14 09:22:59 crc kubenswrapper[4956]: I0314 09:22:59.334232 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-t28gm\" (UniqueName: \"kubernetes.io/projected/577815e3-7633-4eb6-8725-5de6ee7cafc0-kube-api-access-t28gm\") pod \"certified-operators-gnvsv\" (UID: \"577815e3-7633-4eb6-8725-5de6ee7cafc0\") " pod="openshift-marketplace/certified-operators-gnvsv" Mar 14 09:22:59 crc kubenswrapper[4956]: I0314 09:22:59.340646 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gnvsv" Mar 14 09:23:00 crc kubenswrapper[4956]: W0314 09:23:00.070515 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod577815e3_7633_4eb6_8725_5de6ee7cafc0.slice/crio-e1d2dc1686497efd6e24901efbd33c07027329305b90f5eb9331a7583d72b678 WatchSource:0}: Error finding container e1d2dc1686497efd6e24901efbd33c07027329305b90f5eb9331a7583d72b678: Status 404 returned error can't find the container with id e1d2dc1686497efd6e24901efbd33c07027329305b90f5eb9331a7583d72b678 Mar 14 09:23:00 crc kubenswrapper[4956]: I0314 09:23:00.071621 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gnvsv"] Mar 14 09:23:00 crc kubenswrapper[4956]: I0314 09:23:00.289587 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n2nb" event={"ID":"5be1551e-d8a2-4dda-8b7b-c75433011d34","Type":"ContainerStarted","Data":"51b044d5bb7dcaecbae63912ee1ef1a79c23be633b75189474c7a769d4f0ec08"} Mar 14 09:23:00 crc kubenswrapper[4956]: I0314 09:23:00.298014 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnvsv" event={"ID":"577815e3-7633-4eb6-8725-5de6ee7cafc0","Type":"ContainerStarted","Data":"e1d2dc1686497efd6e24901efbd33c07027329305b90f5eb9331a7583d72b678"} Mar 14 09:23:00 crc kubenswrapper[4956]: I0314 09:23:00.312395 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="42a7bea9-8855-43a0-b8b1-f7354c53ac7b" containerID="674a473c573497a0fc97d993bf2eb338e74f882ed41f2726d280415ae9593f84" exitCode=0 Mar 14 09:23:00 crc kubenswrapper[4956]: I0314 09:23:00.312465 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4t6l" event={"ID":"42a7bea9-8855-43a0-b8b1-f7354c53ac7b","Type":"ContainerDied","Data":"674a473c573497a0fc97d993bf2eb338e74f882ed41f2726d280415ae9593f84"} Mar 14 09:23:00 crc kubenswrapper[4956]: I0314 09:23:00.331946 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4n2nb" podStartSLOduration=3.983267406 podStartE2EDuration="9.331924273s" podCreationTimestamp="2026-03-14 09:22:51 +0000 UTC" firstStartedPulling="2026-03-14 09:22:54.232795639 +0000 UTC m=+1579.745487907" lastFinishedPulling="2026-03-14 09:22:59.581452506 +0000 UTC m=+1585.094144774" observedRunningTime="2026-03-14 09:23:00.324964419 +0000 UTC m=+1585.837656687" watchObservedRunningTime="2026-03-14 09:23:00.331924273 +0000 UTC m=+1585.844616541" Mar 14 09:23:01 crc kubenswrapper[4956]: I0314 09:23:01.322384 4956 generic.go:334] "Generic (PLEG): container finished" podID="577815e3-7633-4eb6-8725-5de6ee7cafc0" containerID="1aef38ff7fad65252d4fe6e373ba1da783d1c9b4f3543dd16c566537c684eee3" exitCode=0 Mar 14 09:23:01 crc kubenswrapper[4956]: I0314 09:23:01.322450 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnvsv" event={"ID":"577815e3-7633-4eb6-8725-5de6ee7cafc0","Type":"ContainerDied","Data":"1aef38ff7fad65252d4fe6e373ba1da783d1c9b4f3543dd16c566537c684eee3"} Mar 14 09:23:02 crc kubenswrapper[4956]: I0314 09:23:02.331534 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4t6l" event={"ID":"42a7bea9-8855-43a0-b8b1-f7354c53ac7b","Type":"ContainerStarted","Data":"2c77edea4870da8c1662da3de14dca88c78778c97f91e0c01ea31168fc738f18"} Mar 14 
09:23:02 crc kubenswrapper[4956]: I0314 09:23:02.336141 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4n2nb" Mar 14 09:23:02 crc kubenswrapper[4956]: I0314 09:23:02.336184 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4n2nb" Mar 14 09:23:02 crc kubenswrapper[4956]: I0314 09:23:02.352265 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w4t6l" podStartSLOduration=3.529423517 podStartE2EDuration="8.352246389s" podCreationTimestamp="2026-03-14 09:22:54 +0000 UTC" firstStartedPulling="2026-03-14 09:22:56.530293955 +0000 UTC m=+1582.042986223" lastFinishedPulling="2026-03-14 09:23:01.353116827 +0000 UTC m=+1586.865809095" observedRunningTime="2026-03-14 09:23:02.346428883 +0000 UTC m=+1587.859121151" watchObservedRunningTime="2026-03-14 09:23:02.352246389 +0000 UTC m=+1587.864938657" Mar 14 09:23:02 crc kubenswrapper[4956]: I0314 09:23:02.382826 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4n2nb" Mar 14 09:23:04 crc kubenswrapper[4956]: I0314 09:23:04.360970 4956 generic.go:334] "Generic (PLEG): container finished" podID="577815e3-7633-4eb6-8725-5de6ee7cafc0" containerID="a8e11ccddcd4f6e79255a443fc9973647f064d23beca482c550219bf7b5f09d0" exitCode=0 Mar 14 09:23:04 crc kubenswrapper[4956]: I0314 09:23:04.361023 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnvsv" event={"ID":"577815e3-7633-4eb6-8725-5de6ee7cafc0","Type":"ContainerDied","Data":"a8e11ccddcd4f6e79255a443fc9973647f064d23beca482c550219bf7b5f09d0"} Mar 14 09:23:04 crc kubenswrapper[4956]: I0314 09:23:04.406286 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4n2nb" Mar 14 09:23:04 crc 
kubenswrapper[4956]: I0314 09:23:04.548982 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w4t6l" Mar 14 09:23:04 crc kubenswrapper[4956]: I0314 09:23:04.549042 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w4t6l" Mar 14 09:23:04 crc kubenswrapper[4956]: I0314 09:23:04.593418 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w4t6l" Mar 14 09:23:06 crc kubenswrapper[4956]: I0314 09:23:06.380218 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnvsv" event={"ID":"577815e3-7633-4eb6-8725-5de6ee7cafc0","Type":"ContainerStarted","Data":"0037c3cc85f8e4bf27d615a19a897b413805b04314c71956e71f8c4ed2ad6064"} Mar 14 09:23:06 crc kubenswrapper[4956]: I0314 09:23:06.400095 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gnvsv" podStartSLOduration=4.135459192 podStartE2EDuration="8.400077221s" podCreationTimestamp="2026-03-14 09:22:58 +0000 UTC" firstStartedPulling="2026-03-14 09:23:01.350651555 +0000 UTC m=+1586.863343823" lastFinishedPulling="2026-03-14 09:23:05.615269584 +0000 UTC m=+1591.127961852" observedRunningTime="2026-03-14 09:23:06.396386708 +0000 UTC m=+1591.909078996" watchObservedRunningTime="2026-03-14 09:23:06.400077221 +0000 UTC m=+1591.912769479" Mar 14 09:23:08 crc kubenswrapper[4956]: I0314 09:23:08.806018 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4n2nb"] Mar 14 09:23:08 crc kubenswrapper[4956]: I0314 09:23:08.806821 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4n2nb" podUID="5be1551e-d8a2-4dda-8b7b-c75433011d34" containerName="registry-server" 
containerID="cri-o://51b044d5bb7dcaecbae63912ee1ef1a79c23be633b75189474c7a769d4f0ec08" gracePeriod=2 Mar 14 09:23:09 crc kubenswrapper[4956]: I0314 09:23:09.341155 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gnvsv" Mar 14 09:23:09 crc kubenswrapper[4956]: I0314 09:23:09.341201 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gnvsv" Mar 14 09:23:09 crc kubenswrapper[4956]: I0314 09:23:09.384335 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gnvsv" Mar 14 09:23:09 crc kubenswrapper[4956]: I0314 09:23:09.406086 4956 generic.go:334] "Generic (PLEG): container finished" podID="5be1551e-d8a2-4dda-8b7b-c75433011d34" containerID="51b044d5bb7dcaecbae63912ee1ef1a79c23be633b75189474c7a769d4f0ec08" exitCode=0 Mar 14 09:23:09 crc kubenswrapper[4956]: I0314 09:23:09.406181 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n2nb" event={"ID":"5be1551e-d8a2-4dda-8b7b-c75433011d34","Type":"ContainerDied","Data":"51b044d5bb7dcaecbae63912ee1ef1a79c23be633b75189474c7a769d4f0ec08"} Mar 14 09:23:09 crc kubenswrapper[4956]: I0314 09:23:09.710019 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4n2nb" Mar 14 09:23:09 crc kubenswrapper[4956]: I0314 09:23:09.789340 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkmrm\" (UniqueName: \"kubernetes.io/projected/5be1551e-d8a2-4dda-8b7b-c75433011d34-kube-api-access-mkmrm\") pod \"5be1551e-d8a2-4dda-8b7b-c75433011d34\" (UID: \"5be1551e-d8a2-4dda-8b7b-c75433011d34\") " Mar 14 09:23:09 crc kubenswrapper[4956]: I0314 09:23:09.789712 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be1551e-d8a2-4dda-8b7b-c75433011d34-utilities\") pod \"5be1551e-d8a2-4dda-8b7b-c75433011d34\" (UID: \"5be1551e-d8a2-4dda-8b7b-c75433011d34\") " Mar 14 09:23:09 crc kubenswrapper[4956]: I0314 09:23:09.789808 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be1551e-d8a2-4dda-8b7b-c75433011d34-catalog-content\") pod \"5be1551e-d8a2-4dda-8b7b-c75433011d34\" (UID: \"5be1551e-d8a2-4dda-8b7b-c75433011d34\") " Mar 14 09:23:09 crc kubenswrapper[4956]: I0314 09:23:09.790987 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5be1551e-d8a2-4dda-8b7b-c75433011d34-utilities" (OuterVolumeSpecName: "utilities") pod "5be1551e-d8a2-4dda-8b7b-c75433011d34" (UID: "5be1551e-d8a2-4dda-8b7b-c75433011d34"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:23:09 crc kubenswrapper[4956]: I0314 09:23:09.795631 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5be1551e-d8a2-4dda-8b7b-c75433011d34-kube-api-access-mkmrm" (OuterVolumeSpecName: "kube-api-access-mkmrm") pod "5be1551e-d8a2-4dda-8b7b-c75433011d34" (UID: "5be1551e-d8a2-4dda-8b7b-c75433011d34"). InnerVolumeSpecName "kube-api-access-mkmrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:23:09 crc kubenswrapper[4956]: I0314 09:23:09.816001 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5be1551e-d8a2-4dda-8b7b-c75433011d34-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5be1551e-d8a2-4dda-8b7b-c75433011d34" (UID: "5be1551e-d8a2-4dda-8b7b-c75433011d34"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:23:09 crc kubenswrapper[4956]: I0314 09:23:09.890811 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkmrm\" (UniqueName: \"kubernetes.io/projected/5be1551e-d8a2-4dda-8b7b-c75433011d34-kube-api-access-mkmrm\") on node \"crc\" DevicePath \"\"" Mar 14 09:23:09 crc kubenswrapper[4956]: I0314 09:23:09.890848 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be1551e-d8a2-4dda-8b7b-c75433011d34-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:23:09 crc kubenswrapper[4956]: I0314 09:23:09.890859 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be1551e-d8a2-4dda-8b7b-c75433011d34-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:23:10 crc kubenswrapper[4956]: I0314 09:23:10.417239 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n2nb" event={"ID":"5be1551e-d8a2-4dda-8b7b-c75433011d34","Type":"ContainerDied","Data":"24f5bee3b41653c1f1222c5919f07dc95119a6f8ca19f8dbb534f806b2f56968"} Mar 14 09:23:10 crc kubenswrapper[4956]: I0314 09:23:10.417327 4956 scope.go:117] "RemoveContainer" containerID="51b044d5bb7dcaecbae63912ee1ef1a79c23be633b75189474c7a769d4f0ec08" Mar 14 09:23:10 crc kubenswrapper[4956]: I0314 09:23:10.417337 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4n2nb" Mar 14 09:23:10 crc kubenswrapper[4956]: I0314 09:23:10.448240 4956 scope.go:117] "RemoveContainer" containerID="5edc1be9f7e3c9ac8e95db305f6f7632f419f641816dfcb4fda77699741f2cb7" Mar 14 09:23:10 crc kubenswrapper[4956]: I0314 09:23:10.472125 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4n2nb"] Mar 14 09:23:10 crc kubenswrapper[4956]: I0314 09:23:10.475842 4956 scope.go:117] "RemoveContainer" containerID="05080a6e8e843e38dc819147ea7415194864da09811af823b1139522ba4f3c68" Mar 14 09:23:10 crc kubenswrapper[4956]: I0314 09:23:10.479671 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4n2nb"] Mar 14 09:23:11 crc kubenswrapper[4956]: I0314 09:23:11.209375 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:23:11 crc kubenswrapper[4956]: E0314 09:23:11.210006 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:23:11 crc kubenswrapper[4956]: I0314 09:23:11.218563 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5be1551e-d8a2-4dda-8b7b-c75433011d34" path="/var/lib/kubelet/pods/5be1551e-d8a2-4dda-8b7b-c75433011d34/volumes" Mar 14 09:23:14 crc kubenswrapper[4956]: I0314 09:23:14.589967 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w4t6l" Mar 14 09:23:19 crc kubenswrapper[4956]: I0314 09:23:19.204325 4956 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-w4t6l"] Mar 14 09:23:19 crc kubenswrapper[4956]: I0314 09:23:19.204886 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w4t6l" podUID="42a7bea9-8855-43a0-b8b1-f7354c53ac7b" containerName="registry-server" containerID="cri-o://2c77edea4870da8c1662da3de14dca88c78778c97f91e0c01ea31168fc738f18" gracePeriod=2 Mar 14 09:23:19 crc kubenswrapper[4956]: I0314 09:23:19.384524 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gnvsv" Mar 14 09:23:19 crc kubenswrapper[4956]: I0314 09:23:19.492268 4956 generic.go:334] "Generic (PLEG): container finished" podID="42a7bea9-8855-43a0-b8b1-f7354c53ac7b" containerID="2c77edea4870da8c1662da3de14dca88c78778c97f91e0c01ea31168fc738f18" exitCode=0 Mar 14 09:23:19 crc kubenswrapper[4956]: I0314 09:23:19.492321 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4t6l" event={"ID":"42a7bea9-8855-43a0-b8b1-f7354c53ac7b","Type":"ContainerDied","Data":"2c77edea4870da8c1662da3de14dca88c78778c97f91e0c01ea31168fc738f18"} Mar 14 09:23:19 crc kubenswrapper[4956]: I0314 09:23:19.639340 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w4t6l" Mar 14 09:23:19 crc kubenswrapper[4956]: I0314 09:23:19.658009 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-utilities\") pod \"42a7bea9-8855-43a0-b8b1-f7354c53ac7b\" (UID: \"42a7bea9-8855-43a0-b8b1-f7354c53ac7b\") " Mar 14 09:23:19 crc kubenswrapper[4956]: I0314 09:23:19.658076 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-catalog-content\") pod \"42a7bea9-8855-43a0-b8b1-f7354c53ac7b\" (UID: \"42a7bea9-8855-43a0-b8b1-f7354c53ac7b\") " Mar 14 09:23:19 crc kubenswrapper[4956]: I0314 09:23:19.658131 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jnwv\" (UniqueName: \"kubernetes.io/projected/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-kube-api-access-9jnwv\") pod \"42a7bea9-8855-43a0-b8b1-f7354c53ac7b\" (UID: \"42a7bea9-8855-43a0-b8b1-f7354c53ac7b\") " Mar 14 09:23:19 crc kubenswrapper[4956]: I0314 09:23:19.659020 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-utilities" (OuterVolumeSpecName: "utilities") pod "42a7bea9-8855-43a0-b8b1-f7354c53ac7b" (UID: "42a7bea9-8855-43a0-b8b1-f7354c53ac7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:23:19 crc kubenswrapper[4956]: I0314 09:23:19.666858 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-kube-api-access-9jnwv" (OuterVolumeSpecName: "kube-api-access-9jnwv") pod "42a7bea9-8855-43a0-b8b1-f7354c53ac7b" (UID: "42a7bea9-8855-43a0-b8b1-f7354c53ac7b"). InnerVolumeSpecName "kube-api-access-9jnwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:23:19 crc kubenswrapper[4956]: I0314 09:23:19.714743 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42a7bea9-8855-43a0-b8b1-f7354c53ac7b" (UID: "42a7bea9-8855-43a0-b8b1-f7354c53ac7b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:23:19 crc kubenswrapper[4956]: I0314 09:23:19.759807 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:23:19 crc kubenswrapper[4956]: I0314 09:23:19.759862 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:23:19 crc kubenswrapper[4956]: I0314 09:23:19.759876 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jnwv\" (UniqueName: \"kubernetes.io/projected/42a7bea9-8855-43a0-b8b1-f7354c53ac7b-kube-api-access-9jnwv\") on node \"crc\" DevicePath \"\"" Mar 14 09:23:20 crc kubenswrapper[4956]: I0314 09:23:20.504262 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4t6l" event={"ID":"42a7bea9-8855-43a0-b8b1-f7354c53ac7b","Type":"ContainerDied","Data":"cd7449cca255672339aa0fd0a1eac1fe4b7f43b85705be8f17a2cfc19c86b432"} Mar 14 09:23:20 crc kubenswrapper[4956]: I0314 09:23:20.504316 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w4t6l" Mar 14 09:23:20 crc kubenswrapper[4956]: I0314 09:23:20.504326 4956 scope.go:117] "RemoveContainer" containerID="2c77edea4870da8c1662da3de14dca88c78778c97f91e0c01ea31168fc738f18" Mar 14 09:23:20 crc kubenswrapper[4956]: I0314 09:23:20.540122 4956 scope.go:117] "RemoveContainer" containerID="674a473c573497a0fc97d993bf2eb338e74f882ed41f2726d280415ae9593f84" Mar 14 09:23:20 crc kubenswrapper[4956]: I0314 09:23:20.541952 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w4t6l"] Mar 14 09:23:20 crc kubenswrapper[4956]: I0314 09:23:20.557589 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w4t6l"] Mar 14 09:23:20 crc kubenswrapper[4956]: I0314 09:23:20.562864 4956 scope.go:117] "RemoveContainer" containerID="8ff0bcf41ea10277a9b293a93aed8b45c0e4e25fe2f11ebac891cda612c6b08d" Mar 14 09:23:21 crc kubenswrapper[4956]: I0314 09:23:21.227494 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42a7bea9-8855-43a0-b8b1-f7354c53ac7b" path="/var/lib/kubelet/pods/42a7bea9-8855-43a0-b8b1-f7354c53ac7b/volumes" Mar 14 09:23:23 crc kubenswrapper[4956]: I0314 09:23:23.209689 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:23:23 crc kubenswrapper[4956]: E0314 09:23:23.210192 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:23:24 crc kubenswrapper[4956]: I0314 09:23:24.206150 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-gnvsv"] Mar 14 09:23:24 crc kubenswrapper[4956]: I0314 09:23:24.206806 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gnvsv" podUID="577815e3-7633-4eb6-8725-5de6ee7cafc0" containerName="registry-server" containerID="cri-o://0037c3cc85f8e4bf27d615a19a897b413805b04314c71956e71f8c4ed2ad6064" gracePeriod=2 Mar 14 09:23:24 crc kubenswrapper[4956]: I0314 09:23:24.539599 4956 generic.go:334] "Generic (PLEG): container finished" podID="577815e3-7633-4eb6-8725-5de6ee7cafc0" containerID="0037c3cc85f8e4bf27d615a19a897b413805b04314c71956e71f8c4ed2ad6064" exitCode=0 Mar 14 09:23:24 crc kubenswrapper[4956]: I0314 09:23:24.539649 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnvsv" event={"ID":"577815e3-7633-4eb6-8725-5de6ee7cafc0","Type":"ContainerDied","Data":"0037c3cc85f8e4bf27d615a19a897b413805b04314c71956e71f8c4ed2ad6064"} Mar 14 09:23:24 crc kubenswrapper[4956]: I0314 09:23:24.609424 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gnvsv" Mar 14 09:23:24 crc kubenswrapper[4956]: I0314 09:23:24.628859 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577815e3-7633-4eb6-8725-5de6ee7cafc0-catalog-content\") pod \"577815e3-7633-4eb6-8725-5de6ee7cafc0\" (UID: \"577815e3-7633-4eb6-8725-5de6ee7cafc0\") " Mar 14 09:23:24 crc kubenswrapper[4956]: I0314 09:23:24.628933 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t28gm\" (UniqueName: \"kubernetes.io/projected/577815e3-7633-4eb6-8725-5de6ee7cafc0-kube-api-access-t28gm\") pod \"577815e3-7633-4eb6-8725-5de6ee7cafc0\" (UID: \"577815e3-7633-4eb6-8725-5de6ee7cafc0\") " Mar 14 09:23:24 crc kubenswrapper[4956]: I0314 09:23:24.629188 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577815e3-7633-4eb6-8725-5de6ee7cafc0-utilities\") pod \"577815e3-7633-4eb6-8725-5de6ee7cafc0\" (UID: \"577815e3-7633-4eb6-8725-5de6ee7cafc0\") " Mar 14 09:23:24 crc kubenswrapper[4956]: I0314 09:23:24.630637 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/577815e3-7633-4eb6-8725-5de6ee7cafc0-utilities" (OuterVolumeSpecName: "utilities") pod "577815e3-7633-4eb6-8725-5de6ee7cafc0" (UID: "577815e3-7633-4eb6-8725-5de6ee7cafc0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:23:24 crc kubenswrapper[4956]: I0314 09:23:24.635221 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/577815e3-7633-4eb6-8725-5de6ee7cafc0-kube-api-access-t28gm" (OuterVolumeSpecName: "kube-api-access-t28gm") pod "577815e3-7633-4eb6-8725-5de6ee7cafc0" (UID: "577815e3-7633-4eb6-8725-5de6ee7cafc0"). InnerVolumeSpecName "kube-api-access-t28gm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:23:24 crc kubenswrapper[4956]: I0314 09:23:24.682137 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/577815e3-7633-4eb6-8725-5de6ee7cafc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "577815e3-7633-4eb6-8725-5de6ee7cafc0" (UID: "577815e3-7633-4eb6-8725-5de6ee7cafc0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:23:24 crc kubenswrapper[4956]: I0314 09:23:24.731170 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577815e3-7633-4eb6-8725-5de6ee7cafc0-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:23:24 crc kubenswrapper[4956]: I0314 09:23:24.731213 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577815e3-7633-4eb6-8725-5de6ee7cafc0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:23:24 crc kubenswrapper[4956]: I0314 09:23:24.731228 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t28gm\" (UniqueName: \"kubernetes.io/projected/577815e3-7633-4eb6-8725-5de6ee7cafc0-kube-api-access-t28gm\") on node \"crc\" DevicePath \"\"" Mar 14 09:23:25 crc kubenswrapper[4956]: I0314 09:23:25.549628 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnvsv" event={"ID":"577815e3-7633-4eb6-8725-5de6ee7cafc0","Type":"ContainerDied","Data":"e1d2dc1686497efd6e24901efbd33c07027329305b90f5eb9331a7583d72b678"} Mar 14 09:23:25 crc kubenswrapper[4956]: I0314 09:23:25.550721 4956 scope.go:117] "RemoveContainer" containerID="0037c3cc85f8e4bf27d615a19a897b413805b04314c71956e71f8c4ed2ad6064" Mar 14 09:23:25 crc kubenswrapper[4956]: I0314 09:23:25.549683 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gnvsv" Mar 14 09:23:25 crc kubenswrapper[4956]: I0314 09:23:25.574752 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gnvsv"] Mar 14 09:23:25 crc kubenswrapper[4956]: I0314 09:23:25.576059 4956 scope.go:117] "RemoveContainer" containerID="a8e11ccddcd4f6e79255a443fc9973647f064d23beca482c550219bf7b5f09d0" Mar 14 09:23:25 crc kubenswrapper[4956]: I0314 09:23:25.581944 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gnvsv"] Mar 14 09:23:25 crc kubenswrapper[4956]: I0314 09:23:25.593437 4956 scope.go:117] "RemoveContainer" containerID="1aef38ff7fad65252d4fe6e373ba1da783d1c9b4f3543dd16c566537c684eee3" Mar 14 09:23:27 crc kubenswrapper[4956]: I0314 09:23:27.219911 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="577815e3-7633-4eb6-8725-5de6ee7cafc0" path="/var/lib/kubelet/pods/577815e3-7633-4eb6-8725-5de6ee7cafc0/volumes" Mar 14 09:23:37 crc kubenswrapper[4956]: I0314 09:23:37.209666 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:23:37 crc kubenswrapper[4956]: E0314 09:23:37.210344 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:23:50 crc kubenswrapper[4956]: I0314 09:23:50.208880 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:23:50 crc kubenswrapper[4956]: E0314 09:23:50.210694 4956 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.145771 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558004-dgrj5"] Mar 14 09:24:00 crc kubenswrapper[4956]: E0314 09:24:00.146735 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be1551e-d8a2-4dda-8b7b-c75433011d34" containerName="extract-utilities" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.146756 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be1551e-d8a2-4dda-8b7b-c75433011d34" containerName="extract-utilities" Mar 14 09:24:00 crc kubenswrapper[4956]: E0314 09:24:00.146778 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577815e3-7633-4eb6-8725-5de6ee7cafc0" containerName="extract-utilities" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.146786 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="577815e3-7633-4eb6-8725-5de6ee7cafc0" containerName="extract-utilities" Mar 14 09:24:00 crc kubenswrapper[4956]: E0314 09:24:00.146801 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be1551e-d8a2-4dda-8b7b-c75433011d34" containerName="registry-server" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.146810 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be1551e-d8a2-4dda-8b7b-c75433011d34" containerName="registry-server" Mar 14 09:24:00 crc kubenswrapper[4956]: E0314 09:24:00.146824 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577815e3-7633-4eb6-8725-5de6ee7cafc0" containerName="registry-server" Mar 14 09:24:00 crc kubenswrapper[4956]: 
I0314 09:24:00.146832 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="577815e3-7633-4eb6-8725-5de6ee7cafc0" containerName="registry-server" Mar 14 09:24:00 crc kubenswrapper[4956]: E0314 09:24:00.146842 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42a7bea9-8855-43a0-b8b1-f7354c53ac7b" containerName="registry-server" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.146852 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="42a7bea9-8855-43a0-b8b1-f7354c53ac7b" containerName="registry-server" Mar 14 09:24:00 crc kubenswrapper[4956]: E0314 09:24:00.146867 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be1551e-d8a2-4dda-8b7b-c75433011d34" containerName="extract-content" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.146873 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be1551e-d8a2-4dda-8b7b-c75433011d34" containerName="extract-content" Mar 14 09:24:00 crc kubenswrapper[4956]: E0314 09:24:00.146903 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42a7bea9-8855-43a0-b8b1-f7354c53ac7b" containerName="extract-content" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.146913 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="42a7bea9-8855-43a0-b8b1-f7354c53ac7b" containerName="extract-content" Mar 14 09:24:00 crc kubenswrapper[4956]: E0314 09:24:00.146924 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42a7bea9-8855-43a0-b8b1-f7354c53ac7b" containerName="extract-utilities" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.146931 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="42a7bea9-8855-43a0-b8b1-f7354c53ac7b" containerName="extract-utilities" Mar 14 09:24:00 crc kubenswrapper[4956]: E0314 09:24:00.146944 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577815e3-7633-4eb6-8725-5de6ee7cafc0" containerName="extract-content" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 
09:24:00.146952 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="577815e3-7633-4eb6-8725-5de6ee7cafc0" containerName="extract-content" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.147186 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5be1551e-d8a2-4dda-8b7b-c75433011d34" containerName="registry-server" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.147213 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="577815e3-7633-4eb6-8725-5de6ee7cafc0" containerName="registry-server" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.147229 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="42a7bea9-8855-43a0-b8b1-f7354c53ac7b" containerName="registry-server" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.147992 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558004-dgrj5" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.150518 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.153128 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.153607 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.159755 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558004-dgrj5"] Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.292607 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwlbc\" (UniqueName: \"kubernetes.io/projected/605036d5-9d48-4852-b956-565c2659cb36-kube-api-access-pwlbc\") pod \"auto-csr-approver-29558004-dgrj5\" (UID: 
\"605036d5-9d48-4852-b956-565c2659cb36\") " pod="openshift-infra/auto-csr-approver-29558004-dgrj5" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.394464 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwlbc\" (UniqueName: \"kubernetes.io/projected/605036d5-9d48-4852-b956-565c2659cb36-kube-api-access-pwlbc\") pod \"auto-csr-approver-29558004-dgrj5\" (UID: \"605036d5-9d48-4852-b956-565c2659cb36\") " pod="openshift-infra/auto-csr-approver-29558004-dgrj5" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.420052 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwlbc\" (UniqueName: \"kubernetes.io/projected/605036d5-9d48-4852-b956-565c2659cb36-kube-api-access-pwlbc\") pod \"auto-csr-approver-29558004-dgrj5\" (UID: \"605036d5-9d48-4852-b956-565c2659cb36\") " pod="openshift-infra/auto-csr-approver-29558004-dgrj5" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.469260 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558004-dgrj5" Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.884800 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558004-dgrj5"] Mar 14 09:24:00 crc kubenswrapper[4956]: I0314 09:24:00.922466 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558004-dgrj5" event={"ID":"605036d5-9d48-4852-b956-565c2659cb36","Type":"ContainerStarted","Data":"228899053f88f6151eb359a4c5b6da9019986db4374d57a967cafc781695d4ba"} Mar 14 09:24:01 crc kubenswrapper[4956]: I0314 09:24:01.209777 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:24:01 crc kubenswrapper[4956]: E0314 09:24:01.210075 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:24:02 crc kubenswrapper[4956]: I0314 09:24:02.940276 4956 generic.go:334] "Generic (PLEG): container finished" podID="605036d5-9d48-4852-b956-565c2659cb36" containerID="ac99d14e2e0d7c316131b7e8e8c690b2aa28487343409ff55615848a1a8bf282" exitCode=0 Mar 14 09:24:02 crc kubenswrapper[4956]: I0314 09:24:02.940426 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558004-dgrj5" event={"ID":"605036d5-9d48-4852-b956-565c2659cb36","Type":"ContainerDied","Data":"ac99d14e2e0d7c316131b7e8e8c690b2aa28487343409ff55615848a1a8bf282"} Mar 14 09:24:04 crc kubenswrapper[4956]: I0314 09:24:04.241505 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558004-dgrj5" Mar 14 09:24:04 crc kubenswrapper[4956]: I0314 09:24:04.356718 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwlbc\" (UniqueName: \"kubernetes.io/projected/605036d5-9d48-4852-b956-565c2659cb36-kube-api-access-pwlbc\") pod \"605036d5-9d48-4852-b956-565c2659cb36\" (UID: \"605036d5-9d48-4852-b956-565c2659cb36\") " Mar 14 09:24:04 crc kubenswrapper[4956]: I0314 09:24:04.363303 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/605036d5-9d48-4852-b956-565c2659cb36-kube-api-access-pwlbc" (OuterVolumeSpecName: "kube-api-access-pwlbc") pod "605036d5-9d48-4852-b956-565c2659cb36" (UID: "605036d5-9d48-4852-b956-565c2659cb36"). InnerVolumeSpecName "kube-api-access-pwlbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:24:04 crc kubenswrapper[4956]: I0314 09:24:04.458411 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwlbc\" (UniqueName: \"kubernetes.io/projected/605036d5-9d48-4852-b956-565c2659cb36-kube-api-access-pwlbc\") on node \"crc\" DevicePath \"\"" Mar 14 09:24:04 crc kubenswrapper[4956]: I0314 09:24:04.959396 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558004-dgrj5" event={"ID":"605036d5-9d48-4852-b956-565c2659cb36","Type":"ContainerDied","Data":"228899053f88f6151eb359a4c5b6da9019986db4374d57a967cafc781695d4ba"} Mar 14 09:24:04 crc kubenswrapper[4956]: I0314 09:24:04.959448 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="228899053f88f6151eb359a4c5b6da9019986db4374d57a967cafc781695d4ba" Mar 14 09:24:04 crc kubenswrapper[4956]: I0314 09:24:04.959453 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558004-dgrj5" Mar 14 09:24:05 crc kubenswrapper[4956]: I0314 09:24:05.308895 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557998-xqk25"] Mar 14 09:24:05 crc kubenswrapper[4956]: I0314 09:24:05.317657 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557998-xqk25"] Mar 14 09:24:07 crc kubenswrapper[4956]: I0314 09:24:07.221364 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="847bf01f-0dfa-424b-ae2f-8dba3e277a5c" path="/var/lib/kubelet/pods/847bf01f-0dfa-424b-ae2f-8dba3e277a5c/volumes" Mar 14 09:24:14 crc kubenswrapper[4956]: I0314 09:24:14.209061 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:24:14 crc kubenswrapper[4956]: E0314 09:24:14.209860 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:24:26 crc kubenswrapper[4956]: I0314 09:24:26.209857 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:24:26 crc kubenswrapper[4956]: E0314 09:24:26.211048 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" 
podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:24:40 crc kubenswrapper[4956]: I0314 09:24:40.209822 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:24:40 crc kubenswrapper[4956]: E0314 09:24:40.210668 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:24:51 crc kubenswrapper[4956]: I0314 09:24:51.209212 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:24:51 crc kubenswrapper[4956]: E0314 09:24:51.210278 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:24:54 crc kubenswrapper[4956]: I0314 09:24:54.387614 4956 scope.go:117] "RemoveContainer" containerID="37994017928c556ac062cb21c6afdf9728f5a0725895374937786528ae908194" Mar 14 09:24:54 crc kubenswrapper[4956]: I0314 09:24:54.423996 4956 scope.go:117] "RemoveContainer" containerID="b5e7ad7c58aa5b7bc28a30c20af15a6749969875cb4bf9ad9a0ad6f7e4d12340" Mar 14 09:25:04 crc kubenswrapper[4956]: I0314 09:25:04.208971 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:25:04 crc kubenswrapper[4956]: E0314 09:25:04.209720 4956 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:25:15 crc kubenswrapper[4956]: I0314 09:25:15.214042 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:25:15 crc kubenswrapper[4956]: E0314 09:25:15.214723 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:25:28 crc kubenswrapper[4956]: I0314 09:25:28.209629 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:25:28 crc kubenswrapper[4956]: E0314 09:25:28.210655 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:25:41 crc kubenswrapper[4956]: I0314 09:25:41.209416 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:25:41 crc kubenswrapper[4956]: E0314 
09:25:41.210430 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:25:53 crc kubenswrapper[4956]: I0314 09:25:53.209169 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:25:53 crc kubenswrapper[4956]: E0314 09:25:53.209974 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:26:00 crc kubenswrapper[4956]: I0314 09:26:00.139179 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558006-ms242"] Mar 14 09:26:00 crc kubenswrapper[4956]: E0314 09:26:00.140217 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="605036d5-9d48-4852-b956-565c2659cb36" containerName="oc" Mar 14 09:26:00 crc kubenswrapper[4956]: I0314 09:26:00.140234 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="605036d5-9d48-4852-b956-565c2659cb36" containerName="oc" Mar 14 09:26:00 crc kubenswrapper[4956]: I0314 09:26:00.140433 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="605036d5-9d48-4852-b956-565c2659cb36" containerName="oc" Mar 14 09:26:00 crc kubenswrapper[4956]: I0314 09:26:00.141311 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558006-ms242" Mar 14 09:26:00 crc kubenswrapper[4956]: I0314 09:26:00.145127 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:26:00 crc kubenswrapper[4956]: I0314 09:26:00.145960 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:26:00 crc kubenswrapper[4956]: I0314 09:26:00.148391 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:26:00 crc kubenswrapper[4956]: I0314 09:26:00.152239 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558006-ms242"] Mar 14 09:26:00 crc kubenswrapper[4956]: I0314 09:26:00.237589 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2w9m\" (UniqueName: \"kubernetes.io/projected/2283b32b-1f0b-4db4-b7c7-d31b7e162806-kube-api-access-b2w9m\") pod \"auto-csr-approver-29558006-ms242\" (UID: \"2283b32b-1f0b-4db4-b7c7-d31b7e162806\") " pod="openshift-infra/auto-csr-approver-29558006-ms242" Mar 14 09:26:00 crc kubenswrapper[4956]: I0314 09:26:00.340626 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2w9m\" (UniqueName: \"kubernetes.io/projected/2283b32b-1f0b-4db4-b7c7-d31b7e162806-kube-api-access-b2w9m\") pod \"auto-csr-approver-29558006-ms242\" (UID: \"2283b32b-1f0b-4db4-b7c7-d31b7e162806\") " pod="openshift-infra/auto-csr-approver-29558006-ms242" Mar 14 09:26:00 crc kubenswrapper[4956]: I0314 09:26:00.365771 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2w9m\" (UniqueName: \"kubernetes.io/projected/2283b32b-1f0b-4db4-b7c7-d31b7e162806-kube-api-access-b2w9m\") pod \"auto-csr-approver-29558006-ms242\" (UID: \"2283b32b-1f0b-4db4-b7c7-d31b7e162806\") " 
pod="openshift-infra/auto-csr-approver-29558006-ms242" Mar 14 09:26:00 crc kubenswrapper[4956]: I0314 09:26:00.459357 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558006-ms242" Mar 14 09:26:00 crc kubenswrapper[4956]: I0314 09:26:00.921046 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558006-ms242"] Mar 14 09:26:01 crc kubenswrapper[4956]: I0314 09:26:01.933706 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558006-ms242" event={"ID":"2283b32b-1f0b-4db4-b7c7-d31b7e162806","Type":"ContainerStarted","Data":"369f935b7c6e9fe11bf36f0767b4924b36bc1f27b343b1ecc9a0d93975c7f981"} Mar 14 09:26:02 crc kubenswrapper[4956]: I0314 09:26:02.960683 4956 generic.go:334] "Generic (PLEG): container finished" podID="2283b32b-1f0b-4db4-b7c7-d31b7e162806" containerID="869e41b959441f202198c6d10e3149b7e036d45e993b0afa9acbc3288b697396" exitCode=0 Mar 14 09:26:02 crc kubenswrapper[4956]: I0314 09:26:02.960730 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558006-ms242" event={"ID":"2283b32b-1f0b-4db4-b7c7-d31b7e162806","Type":"ContainerDied","Data":"869e41b959441f202198c6d10e3149b7e036d45e993b0afa9acbc3288b697396"} Mar 14 09:26:04 crc kubenswrapper[4956]: I0314 09:26:04.251077 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558006-ms242" Mar 14 09:26:04 crc kubenswrapper[4956]: I0314 09:26:04.409235 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2w9m\" (UniqueName: \"kubernetes.io/projected/2283b32b-1f0b-4db4-b7c7-d31b7e162806-kube-api-access-b2w9m\") pod \"2283b32b-1f0b-4db4-b7c7-d31b7e162806\" (UID: \"2283b32b-1f0b-4db4-b7c7-d31b7e162806\") " Mar 14 09:26:04 crc kubenswrapper[4956]: I0314 09:26:04.415431 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2283b32b-1f0b-4db4-b7c7-d31b7e162806-kube-api-access-b2w9m" (OuterVolumeSpecName: "kube-api-access-b2w9m") pod "2283b32b-1f0b-4db4-b7c7-d31b7e162806" (UID: "2283b32b-1f0b-4db4-b7c7-d31b7e162806"). InnerVolumeSpecName "kube-api-access-b2w9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:26:04 crc kubenswrapper[4956]: I0314 09:26:04.510994 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2w9m\" (UniqueName: \"kubernetes.io/projected/2283b32b-1f0b-4db4-b7c7-d31b7e162806-kube-api-access-b2w9m\") on node \"crc\" DevicePath \"\"" Mar 14 09:26:04 crc kubenswrapper[4956]: I0314 09:26:04.978600 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558006-ms242" event={"ID":"2283b32b-1f0b-4db4-b7c7-d31b7e162806","Type":"ContainerDied","Data":"369f935b7c6e9fe11bf36f0767b4924b36bc1f27b343b1ecc9a0d93975c7f981"} Mar 14 09:26:04 crc kubenswrapper[4956]: I0314 09:26:04.978648 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="369f935b7c6e9fe11bf36f0767b4924b36bc1f27b343b1ecc9a0d93975c7f981" Mar 14 09:26:04 crc kubenswrapper[4956]: I0314 09:26:04.978674 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558006-ms242" Mar 14 09:26:05 crc kubenswrapper[4956]: I0314 09:26:05.332045 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558000-7z52v"] Mar 14 09:26:05 crc kubenswrapper[4956]: I0314 09:26:05.340576 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558000-7z52v"] Mar 14 09:26:07 crc kubenswrapper[4956]: I0314 09:26:07.222061 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e9098de-4e80-4b2f-860d-065fac0cc574" path="/var/lib/kubelet/pods/4e9098de-4e80-4b2f-860d-065fac0cc574/volumes" Mar 14 09:26:08 crc kubenswrapper[4956]: I0314 09:26:08.209738 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:26:08 crc kubenswrapper[4956]: E0314 09:26:08.210052 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:26:20 crc kubenswrapper[4956]: I0314 09:26:20.209652 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:26:20 crc kubenswrapper[4956]: E0314 09:26:20.210322 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" 
podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:26:34 crc kubenswrapper[4956]: I0314 09:26:34.209525 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:26:34 crc kubenswrapper[4956]: E0314 09:26:34.210242 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:26:47 crc kubenswrapper[4956]: I0314 09:26:47.209606 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:26:47 crc kubenswrapper[4956]: E0314 09:26:47.210329 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:26:54 crc kubenswrapper[4956]: I0314 09:26:54.512152 4956 scope.go:117] "RemoveContainer" containerID="19d19b6f9bdb20d58e7b55d660211ff8fb28b46014f7ea8bf2cac697e1d7cecf" Mar 14 09:27:01 crc kubenswrapper[4956]: I0314 09:27:01.209544 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:27:01 crc kubenswrapper[4956]: E0314 09:27:01.211934 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:27:13 crc kubenswrapper[4956]: I0314 09:27:13.209768 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:27:13 crc kubenswrapper[4956]: E0314 09:27:13.210888 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:27:26 crc kubenswrapper[4956]: I0314 09:27:26.210217 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:27:26 crc kubenswrapper[4956]: E0314 09:27:26.211142 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:27:37 crc kubenswrapper[4956]: I0314 09:27:37.209705 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:27:37 crc kubenswrapper[4956]: E0314 09:27:37.210456 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:27:38 crc kubenswrapper[4956]: I0314 09:27:38.134878 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/root-account-create-update-tj699"] Mar 14 09:27:38 crc kubenswrapper[4956]: I0314 09:27:38.169731 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/root-account-create-update-tj699"] Mar 14 09:27:39 crc kubenswrapper[4956]: I0314 09:27:39.219280 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c370ae95-6bb5-4a51-88ef-d0ce6811faa3" path="/var/lib/kubelet/pods/c370ae95-6bb5-4a51-88ef-d0ce6811faa3/volumes" Mar 14 09:27:40 crc kubenswrapper[4956]: I0314 09:27:40.031469 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz"] Mar 14 09:27:40 crc kubenswrapper[4956]: I0314 09:27:40.044217 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-db-create-f89ks"] Mar 14 09:27:40 crc kubenswrapper[4956]: I0314 09:27:40.057007 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-ee8a-account-create-update-gqmgz"] Mar 14 09:27:40 crc kubenswrapper[4956]: I0314 09:27:40.065412 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-db-create-f89ks"] Mar 14 09:27:41 crc kubenswrapper[4956]: I0314 09:27:41.224410 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d63538-b0d7-4e51-9b51-8c3855488802" path="/var/lib/kubelet/pods/44d63538-b0d7-4e51-9b51-8c3855488802/volumes" Mar 14 09:27:41 crc kubenswrapper[4956]: I0314 09:27:41.225790 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d8dd1e13-b758-484a-9310-0fbd88fdd7ca" path="/var/lib/kubelet/pods/d8dd1e13-b758-484a-9310-0fbd88fdd7ca/volumes" Mar 14 09:27:51 crc kubenswrapper[4956]: I0314 09:27:51.209869 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:27:51 crc kubenswrapper[4956]: E0314 09:27:51.210681 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:27:54 crc kubenswrapper[4956]: I0314 09:27:54.593119 4956 scope.go:117] "RemoveContainer" containerID="54e3e91deedba475cc5c77f2aa4db0c5b6cfa2ef820109ee65a47f06539fc0c0" Mar 14 09:27:54 crc kubenswrapper[4956]: I0314 09:27:54.624919 4956 scope.go:117] "RemoveContainer" containerID="fa2bcc70198965d972eb6a2ee18e6c80aa308cdafc9b18fb7fb6003c6cb14ee9" Mar 14 09:27:54 crc kubenswrapper[4956]: I0314 09:27:54.658662 4956 scope.go:117] "RemoveContainer" containerID="a60df08c4fee37c3b526cb8696446bb6aacb51461f984650f76a65bd96b9d973" Mar 14 09:28:00 crc kubenswrapper[4956]: I0314 09:28:00.145504 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558008-vhbd6"] Mar 14 09:28:00 crc kubenswrapper[4956]: E0314 09:28:00.146280 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2283b32b-1f0b-4db4-b7c7-d31b7e162806" containerName="oc" Mar 14 09:28:00 crc kubenswrapper[4956]: I0314 09:28:00.146298 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="2283b32b-1f0b-4db4-b7c7-d31b7e162806" containerName="oc" Mar 14 09:28:00 crc kubenswrapper[4956]: I0314 09:28:00.146535 4956 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2283b32b-1f0b-4db4-b7c7-d31b7e162806" containerName="oc" Mar 14 09:28:00 crc kubenswrapper[4956]: I0314 09:28:00.147117 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558008-vhbd6" Mar 14 09:28:00 crc kubenswrapper[4956]: I0314 09:28:00.149650 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:28:00 crc kubenswrapper[4956]: I0314 09:28:00.150070 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:28:00 crc kubenswrapper[4956]: I0314 09:28:00.151970 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:28:00 crc kubenswrapper[4956]: I0314 09:28:00.154530 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558008-vhbd6"] Mar 14 09:28:00 crc kubenswrapper[4956]: I0314 09:28:00.256116 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67d47\" (UniqueName: \"kubernetes.io/projected/2944c10a-78bd-49e1-a6a0-99db82d462f5-kube-api-access-67d47\") pod \"auto-csr-approver-29558008-vhbd6\" (UID: \"2944c10a-78bd-49e1-a6a0-99db82d462f5\") " pod="openshift-infra/auto-csr-approver-29558008-vhbd6" Mar 14 09:28:00 crc kubenswrapper[4956]: I0314 09:28:00.357804 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67d47\" (UniqueName: \"kubernetes.io/projected/2944c10a-78bd-49e1-a6a0-99db82d462f5-kube-api-access-67d47\") pod \"auto-csr-approver-29558008-vhbd6\" (UID: \"2944c10a-78bd-49e1-a6a0-99db82d462f5\") " pod="openshift-infra/auto-csr-approver-29558008-vhbd6" Mar 14 09:28:00 crc kubenswrapper[4956]: I0314 09:28:00.379273 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67d47\" (UniqueName: 
\"kubernetes.io/projected/2944c10a-78bd-49e1-a6a0-99db82d462f5-kube-api-access-67d47\") pod \"auto-csr-approver-29558008-vhbd6\" (UID: \"2944c10a-78bd-49e1-a6a0-99db82d462f5\") " pod="openshift-infra/auto-csr-approver-29558008-vhbd6" Mar 14 09:28:00 crc kubenswrapper[4956]: I0314 09:28:00.467828 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558008-vhbd6" Mar 14 09:28:00 crc kubenswrapper[4956]: I0314 09:28:00.882570 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558008-vhbd6"] Mar 14 09:28:00 crc kubenswrapper[4956]: I0314 09:28:00.899439 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:28:00 crc kubenswrapper[4956]: I0314 09:28:00.955467 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558008-vhbd6" event={"ID":"2944c10a-78bd-49e1-a6a0-99db82d462f5","Type":"ContainerStarted","Data":"780c7faa7003daa72e757b497638a5ad7d0c961d8576dbc0b124e23a3b6320f8"} Mar 14 09:28:02 crc kubenswrapper[4956]: I0314 09:28:02.971312 4956 generic.go:334] "Generic (PLEG): container finished" podID="2944c10a-78bd-49e1-a6a0-99db82d462f5" containerID="c0a5cff7e539be9cad9438a4453c71493a7597a9f8bb338ab357327b2a140b30" exitCode=0 Mar 14 09:28:02 crc kubenswrapper[4956]: I0314 09:28:02.971410 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558008-vhbd6" event={"ID":"2944c10a-78bd-49e1-a6a0-99db82d462f5","Type":"ContainerDied","Data":"c0a5cff7e539be9cad9438a4453c71493a7597a9f8bb338ab357327b2a140b30"} Mar 14 09:28:04 crc kubenswrapper[4956]: I0314 09:28:04.209721 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:28:04 crc kubenswrapper[4956]: I0314 09:28:04.321390 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558008-vhbd6" Mar 14 09:28:04 crc kubenswrapper[4956]: I0314 09:28:04.420889 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67d47\" (UniqueName: \"kubernetes.io/projected/2944c10a-78bd-49e1-a6a0-99db82d462f5-kube-api-access-67d47\") pod \"2944c10a-78bd-49e1-a6a0-99db82d462f5\" (UID: \"2944c10a-78bd-49e1-a6a0-99db82d462f5\") " Mar 14 09:28:04 crc kubenswrapper[4956]: I0314 09:28:04.433458 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2944c10a-78bd-49e1-a6a0-99db82d462f5-kube-api-access-67d47" (OuterVolumeSpecName: "kube-api-access-67d47") pod "2944c10a-78bd-49e1-a6a0-99db82d462f5" (UID: "2944c10a-78bd-49e1-a6a0-99db82d462f5"). InnerVolumeSpecName "kube-api-access-67d47". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:28:04 crc kubenswrapper[4956]: I0314 09:28:04.522341 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67d47\" (UniqueName: \"kubernetes.io/projected/2944c10a-78bd-49e1-a6a0-99db82d462f5-kube-api-access-67d47\") on node \"crc\" DevicePath \"\"" Mar 14 09:28:04 crc kubenswrapper[4956]: I0314 09:28:04.991933 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558008-vhbd6" Mar 14 09:28:04 crc kubenswrapper[4956]: I0314 09:28:04.991898 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558008-vhbd6" event={"ID":"2944c10a-78bd-49e1-a6a0-99db82d462f5","Type":"ContainerDied","Data":"780c7faa7003daa72e757b497638a5ad7d0c961d8576dbc0b124e23a3b6320f8"} Mar 14 09:28:04 crc kubenswrapper[4956]: I0314 09:28:04.992143 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="780c7faa7003daa72e757b497638a5ad7d0c961d8576dbc0b124e23a3b6320f8" Mar 14 09:28:05 crc kubenswrapper[4956]: I0314 09:28:05.000603 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerStarted","Data":"75c968c7158a86ccc8a0fc41e6a65ef91e9a8b27c10e35460c675ff21752f263"} Mar 14 09:28:05 crc kubenswrapper[4956]: I0314 09:28:05.387292 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558002-9zjfh"] Mar 14 09:28:05 crc kubenswrapper[4956]: I0314 09:28:05.395146 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558002-9zjfh"] Mar 14 09:28:07 crc kubenswrapper[4956]: I0314 09:28:07.220365 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b915a811-e123-4d41-a773-f1610198bb90" path="/var/lib/kubelet/pods/b915a811-e123-4d41-a773-f1610198bb90/volumes" Mar 14 09:28:08 crc kubenswrapper[4956]: I0314 09:28:08.041746 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-fqjr8"] Mar 14 09:28:08 crc kubenswrapper[4956]: I0314 09:28:08.052043 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-fqjr8"] Mar 14 09:28:09 crc kubenswrapper[4956]: I0314 09:28:09.220000 4956 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="dac0e60a-75b8-410d-b536-8c91aac1873a" path="/var/lib/kubelet/pods/dac0e60a-75b8-410d-b536-8c91aac1873a/volumes" Mar 14 09:28:29 crc kubenswrapper[4956]: I0314 09:28:29.042313 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-4gqxj"] Mar 14 09:28:29 crc kubenswrapper[4956]: I0314 09:28:29.048815 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-4gqxj"] Mar 14 09:28:29 crc kubenswrapper[4956]: I0314 09:28:29.219092 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4212e38-b90e-4dc4-b431-c8b943624ffd" path="/var/lib/kubelet/pods/f4212e38-b90e-4dc4-b431-c8b943624ffd/volumes" Mar 14 09:28:54 crc kubenswrapper[4956]: I0314 09:28:54.735347 4956 scope.go:117] "RemoveContainer" containerID="d7ee5b91e6edffce07e8bbffe9903192a3da277e243b2ad852e691ac9ad013bf" Mar 14 09:28:54 crc kubenswrapper[4956]: I0314 09:28:54.799337 4956 scope.go:117] "RemoveContainer" containerID="f3950bca3977a4c804cecde459e8d39654c6e790ad12f6dcaa7d9545ae4b2406" Mar 14 09:28:54 crc kubenswrapper[4956]: I0314 09:28:54.855979 4956 scope.go:117] "RemoveContainer" containerID="6b8c0650e8787f9a9f1dfcb7fb9f7f30bfe5c4005e10c6d8c6b74845f8cb733d" Mar 14 09:29:24 crc kubenswrapper[4956]: I0314 09:29:24.063594 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-m6s6v"] Mar 14 09:29:24 crc kubenswrapper[4956]: I0314 09:29:24.077548 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-0215-account-create-update-5sn9p"] Mar 14 09:29:24 crc kubenswrapper[4956]: I0314 09:29:24.088726 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-m6s6v"] Mar 14 09:29:24 crc kubenswrapper[4956]: I0314 09:29:24.104630 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-0215-account-create-update-5sn9p"] Mar 
14 09:29:25 crc kubenswrapper[4956]: I0314 09:29:25.219870 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57cae041-75e9-407f-8d18-3c0da11e7a73" path="/var/lib/kubelet/pods/57cae041-75e9-407f-8d18-3c0da11e7a73/volumes" Mar 14 09:29:25 crc kubenswrapper[4956]: I0314 09:29:25.220561 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddaf68d2-3b2c-4293-a453-ecbeb31bea81" path="/var/lib/kubelet/pods/ddaf68d2-3b2c-4293-a453-ecbeb31bea81/volumes" Mar 14 09:29:54 crc kubenswrapper[4956]: I0314 09:29:54.942353 4956 scope.go:117] "RemoveContainer" containerID="99d6e24ce25eefdbf16f95231746247c8f258c2075ab3b2488e756c82c682b8f" Mar 14 09:29:54 crc kubenswrapper[4956]: I0314 09:29:54.965926 4956 scope.go:117] "RemoveContainer" containerID="36ff6362dff416d20c94f92e2f02b441454d355f249f615a267c686e7d6fd339" Mar 14 09:29:57 crc kubenswrapper[4956]: I0314 09:29:57.076461 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv"] Mar 14 09:29:57 crc kubenswrapper[4956]: I0314 09:29:57.090070 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-hfdcv"] Mar 14 09:29:57 crc kubenswrapper[4956]: I0314 09:29:57.219893 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a547039f-2d8f-45d4-8b32-05fa39f51b9c" path="/var/lib/kubelet/pods/a547039f-2d8f-45d4-8b32-05fa39f51b9c/volumes" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.146275 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558010-9n578"] Mar 14 09:30:00 crc kubenswrapper[4956]: E0314 09:30:00.147008 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2944c10a-78bd-49e1-a6a0-99db82d462f5" containerName="oc" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.147021 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="2944c10a-78bd-49e1-a6a0-99db82d462f5" containerName="oc" Mar 
14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.147185 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="2944c10a-78bd-49e1-a6a0-99db82d462f5" containerName="oc" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.147785 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558010-9n578" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.150709 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.150716 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.150716 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.157532 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t"] Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.158853 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.161016 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.163171 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.178315 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t"] Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.189612 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558010-9n578"] Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.298240 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25604e27-1692-429b-bb26-e24e41514227-config-volume\") pod \"collect-profiles-29558010-2fx5t\" (UID: \"25604e27-1692-429b-bb26-e24e41514227\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.298752 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25604e27-1692-429b-bb26-e24e41514227-secret-volume\") pod \"collect-profiles-29558010-2fx5t\" (UID: \"25604e27-1692-429b-bb26-e24e41514227\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.298790 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdznr\" (UniqueName: 
\"kubernetes.io/projected/5322a936-a1ad-4a11-8264-7f6d088026ac-kube-api-access-cdznr\") pod \"auto-csr-approver-29558010-9n578\" (UID: \"5322a936-a1ad-4a11-8264-7f6d088026ac\") " pod="openshift-infra/auto-csr-approver-29558010-9n578" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.298855 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqvh9\" (UniqueName: \"kubernetes.io/projected/25604e27-1692-429b-bb26-e24e41514227-kube-api-access-xqvh9\") pod \"collect-profiles-29558010-2fx5t\" (UID: \"25604e27-1692-429b-bb26-e24e41514227\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.399438 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25604e27-1692-429b-bb26-e24e41514227-config-volume\") pod \"collect-profiles-29558010-2fx5t\" (UID: \"25604e27-1692-429b-bb26-e24e41514227\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.399550 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25604e27-1692-429b-bb26-e24e41514227-secret-volume\") pod \"collect-profiles-29558010-2fx5t\" (UID: \"25604e27-1692-429b-bb26-e24e41514227\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.399575 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdznr\" (UniqueName: \"kubernetes.io/projected/5322a936-a1ad-4a11-8264-7f6d088026ac-kube-api-access-cdznr\") pod \"auto-csr-approver-29558010-9n578\" (UID: \"5322a936-a1ad-4a11-8264-7f6d088026ac\") " pod="openshift-infra/auto-csr-approver-29558010-9n578" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 
09:30:00.399602 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqvh9\" (UniqueName: \"kubernetes.io/projected/25604e27-1692-429b-bb26-e24e41514227-kube-api-access-xqvh9\") pod \"collect-profiles-29558010-2fx5t\" (UID: \"25604e27-1692-429b-bb26-e24e41514227\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.400813 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25604e27-1692-429b-bb26-e24e41514227-config-volume\") pod \"collect-profiles-29558010-2fx5t\" (UID: \"25604e27-1692-429b-bb26-e24e41514227\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.406367 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25604e27-1692-429b-bb26-e24e41514227-secret-volume\") pod \"collect-profiles-29558010-2fx5t\" (UID: \"25604e27-1692-429b-bb26-e24e41514227\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.417292 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdznr\" (UniqueName: \"kubernetes.io/projected/5322a936-a1ad-4a11-8264-7f6d088026ac-kube-api-access-cdznr\") pod \"auto-csr-approver-29558010-9n578\" (UID: \"5322a936-a1ad-4a11-8264-7f6d088026ac\") " pod="openshift-infra/auto-csr-approver-29558010-9n578" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.418234 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqvh9\" (UniqueName: \"kubernetes.io/projected/25604e27-1692-429b-bb26-e24e41514227-kube-api-access-xqvh9\") pod \"collect-profiles-29558010-2fx5t\" (UID: \"25604e27-1692-429b-bb26-e24e41514227\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.476216 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558010-9n578" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.487446 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.938826 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558010-9n578"] Mar 14 09:30:00 crc kubenswrapper[4956]: I0314 09:30:00.998290 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t"] Mar 14 09:30:00 crc kubenswrapper[4956]: W0314 09:30:00.999597 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25604e27_1692_429b_bb26_e24e41514227.slice/crio-67483ec9add85eb7518967ca9c6e794a91bc0bb4409e1cf3080d2c6510a04a6f WatchSource:0}: Error finding container 67483ec9add85eb7518967ca9c6e794a91bc0bb4409e1cf3080d2c6510a04a6f: Status 404 returned error can't find the container with id 67483ec9add85eb7518967ca9c6e794a91bc0bb4409e1cf3080d2c6510a04a6f Mar 14 09:30:01 crc kubenswrapper[4956]: I0314 09:30:01.189887 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558010-9n578" event={"ID":"5322a936-a1ad-4a11-8264-7f6d088026ac","Type":"ContainerStarted","Data":"9a8e53c20bd5f9669970466ae2c9c7d86449fef90438d04690230def2d3fb7e4"} Mar 14 09:30:01 crc kubenswrapper[4956]: I0314 09:30:01.191023 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" 
event={"ID":"25604e27-1692-429b-bb26-e24e41514227","Type":"ContainerStarted","Data":"2a6dc52af56f9f1a0f49d61b039b69c8f7fba1af23da0b99f2d1959c6b97b67e"} Mar 14 09:30:01 crc kubenswrapper[4956]: I0314 09:30:01.191057 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" event={"ID":"25604e27-1692-429b-bb26-e24e41514227","Type":"ContainerStarted","Data":"67483ec9add85eb7518967ca9c6e794a91bc0bb4409e1cf3080d2c6510a04a6f"} Mar 14 09:30:01 crc kubenswrapper[4956]: I0314 09:30:01.219755 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" podStartSLOduration=1.219727058 podStartE2EDuration="1.219727058s" podCreationTimestamp="2026-03-14 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:30:01.203570273 +0000 UTC m=+2006.716262541" watchObservedRunningTime="2026-03-14 09:30:01.219727058 +0000 UTC m=+2006.732419326" Mar 14 09:30:02 crc kubenswrapper[4956]: I0314 09:30:02.201751 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558010-9n578" event={"ID":"5322a936-a1ad-4a11-8264-7f6d088026ac","Type":"ContainerStarted","Data":"3869937ff7b9ac2d31c83c6cb681ff0d14457d6147b97297ad97bec680ab6536"} Mar 14 09:30:02 crc kubenswrapper[4956]: I0314 09:30:02.203263 4956 generic.go:334] "Generic (PLEG): container finished" podID="25604e27-1692-429b-bb26-e24e41514227" containerID="2a6dc52af56f9f1a0f49d61b039b69c8f7fba1af23da0b99f2d1959c6b97b67e" exitCode=0 Mar 14 09:30:02 crc kubenswrapper[4956]: I0314 09:30:02.203298 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" 
event={"ID":"25604e27-1692-429b-bb26-e24e41514227","Type":"ContainerDied","Data":"2a6dc52af56f9f1a0f49d61b039b69c8f7fba1af23da0b99f2d1959c6b97b67e"} Mar 14 09:30:02 crc kubenswrapper[4956]: I0314 09:30:02.222816 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558010-9n578" podStartSLOduration=1.39111214 podStartE2EDuration="2.222795473s" podCreationTimestamp="2026-03-14 09:30:00 +0000 UTC" firstStartedPulling="2026-03-14 09:30:00.945731552 +0000 UTC m=+2006.458423820" lastFinishedPulling="2026-03-14 09:30:01.777414885 +0000 UTC m=+2007.290107153" observedRunningTime="2026-03-14 09:30:02.218940646 +0000 UTC m=+2007.731632904" watchObservedRunningTime="2026-03-14 09:30:02.222795473 +0000 UTC m=+2007.735487731" Mar 14 09:30:03 crc kubenswrapper[4956]: I0314 09:30:03.213413 4956 generic.go:334] "Generic (PLEG): container finished" podID="5322a936-a1ad-4a11-8264-7f6d088026ac" containerID="3869937ff7b9ac2d31c83c6cb681ff0d14457d6147b97297ad97bec680ab6536" exitCode=0 Mar 14 09:30:03 crc kubenswrapper[4956]: I0314 09:30:03.224216 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558010-9n578" event={"ID":"5322a936-a1ad-4a11-8264-7f6d088026ac","Type":"ContainerDied","Data":"3869937ff7b9ac2d31c83c6cb681ff0d14457d6147b97297ad97bec680ab6536"} Mar 14 09:30:03 crc kubenswrapper[4956]: I0314 09:30:03.496835 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" Mar 14 09:30:03 crc kubenswrapper[4956]: I0314 09:30:03.653252 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25604e27-1692-429b-bb26-e24e41514227-secret-volume\") pod \"25604e27-1692-429b-bb26-e24e41514227\" (UID: \"25604e27-1692-429b-bb26-e24e41514227\") " Mar 14 09:30:03 crc kubenswrapper[4956]: I0314 09:30:03.653928 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25604e27-1692-429b-bb26-e24e41514227-config-volume\") pod \"25604e27-1692-429b-bb26-e24e41514227\" (UID: \"25604e27-1692-429b-bb26-e24e41514227\") " Mar 14 09:30:03 crc kubenswrapper[4956]: I0314 09:30:03.654074 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqvh9\" (UniqueName: \"kubernetes.io/projected/25604e27-1692-429b-bb26-e24e41514227-kube-api-access-xqvh9\") pod \"25604e27-1692-429b-bb26-e24e41514227\" (UID: \"25604e27-1692-429b-bb26-e24e41514227\") " Mar 14 09:30:03 crc kubenswrapper[4956]: I0314 09:30:03.655075 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25604e27-1692-429b-bb26-e24e41514227-config-volume" (OuterVolumeSpecName: "config-volume") pod "25604e27-1692-429b-bb26-e24e41514227" (UID: "25604e27-1692-429b-bb26-e24e41514227"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:30:03 crc kubenswrapper[4956]: I0314 09:30:03.658783 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25604e27-1692-429b-bb26-e24e41514227-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "25604e27-1692-429b-bb26-e24e41514227" (UID: "25604e27-1692-429b-bb26-e24e41514227"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:03 crc kubenswrapper[4956]: I0314 09:30:03.659047 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25604e27-1692-429b-bb26-e24e41514227-kube-api-access-xqvh9" (OuterVolumeSpecName: "kube-api-access-xqvh9") pod "25604e27-1692-429b-bb26-e24e41514227" (UID: "25604e27-1692-429b-bb26-e24e41514227"). InnerVolumeSpecName "kube-api-access-xqvh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:30:03 crc kubenswrapper[4956]: I0314 09:30:03.755432 4956 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25604e27-1692-429b-bb26-e24e41514227-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:03 crc kubenswrapper[4956]: I0314 09:30:03.755765 4956 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25604e27-1692-429b-bb26-e24e41514227-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:03 crc kubenswrapper[4956]: I0314 09:30:03.755859 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqvh9\" (UniqueName: \"kubernetes.io/projected/25604e27-1692-429b-bb26-e24e41514227-kube-api-access-xqvh9\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:04 crc kubenswrapper[4956]: I0314 09:30:04.222046 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" Mar 14 09:30:04 crc kubenswrapper[4956]: I0314 09:30:04.222555 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-2fx5t" event={"ID":"25604e27-1692-429b-bb26-e24e41514227","Type":"ContainerDied","Data":"67483ec9add85eb7518967ca9c6e794a91bc0bb4409e1cf3080d2c6510a04a6f"} Mar 14 09:30:04 crc kubenswrapper[4956]: I0314 09:30:04.222585 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67483ec9add85eb7518967ca9c6e794a91bc0bb4409e1cf3080d2c6510a04a6f" Mar 14 09:30:04 crc kubenswrapper[4956]: I0314 09:30:04.273659 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr"] Mar 14 09:30:04 crc kubenswrapper[4956]: I0314 09:30:04.279957 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557965-jz6kr"] Mar 14 09:30:04 crc kubenswrapper[4956]: I0314 09:30:04.492400 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558010-9n578" Mar 14 09:30:04 crc kubenswrapper[4956]: I0314 09:30:04.671515 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdznr\" (UniqueName: \"kubernetes.io/projected/5322a936-a1ad-4a11-8264-7f6d088026ac-kube-api-access-cdznr\") pod \"5322a936-a1ad-4a11-8264-7f6d088026ac\" (UID: \"5322a936-a1ad-4a11-8264-7f6d088026ac\") " Mar 14 09:30:04 crc kubenswrapper[4956]: I0314 09:30:04.676283 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5322a936-a1ad-4a11-8264-7f6d088026ac-kube-api-access-cdznr" (OuterVolumeSpecName: "kube-api-access-cdznr") pod "5322a936-a1ad-4a11-8264-7f6d088026ac" (UID: "5322a936-a1ad-4a11-8264-7f6d088026ac"). 
InnerVolumeSpecName "kube-api-access-cdznr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:30:04 crc kubenswrapper[4956]: I0314 09:30:04.773801 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdznr\" (UniqueName: \"kubernetes.io/projected/5322a936-a1ad-4a11-8264-7f6d088026ac-kube-api-access-cdznr\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:05 crc kubenswrapper[4956]: I0314 09:30:05.219802 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea6e9606-19aa-43f7-8344-ebc9f5c3f31a" path="/var/lib/kubelet/pods/ea6e9606-19aa-43f7-8344-ebc9f5c3f31a/volumes" Mar 14 09:30:05 crc kubenswrapper[4956]: I0314 09:30:05.231907 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558010-9n578" event={"ID":"5322a936-a1ad-4a11-8264-7f6d088026ac","Type":"ContainerDied","Data":"9a8e53c20bd5f9669970466ae2c9c7d86449fef90438d04690230def2d3fb7e4"} Mar 14 09:30:05 crc kubenswrapper[4956]: I0314 09:30:05.232497 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a8e53c20bd5f9669970466ae2c9c7d86449fef90438d04690230def2d3fb7e4" Mar 14 09:30:05 crc kubenswrapper[4956]: I0314 09:30:05.231964 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558010-9n578" Mar 14 09:30:05 crc kubenswrapper[4956]: I0314 09:30:05.286380 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558004-dgrj5"] Mar 14 09:30:05 crc kubenswrapper[4956]: I0314 09:30:05.293646 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558004-dgrj5"] Mar 14 09:30:07 crc kubenswrapper[4956]: I0314 09:30:07.218964 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="605036d5-9d48-4852-b956-565c2659cb36" path="/var/lib/kubelet/pods/605036d5-9d48-4852-b956-565c2659cb36/volumes" Mar 14 09:30:25 crc kubenswrapper[4956]: I0314 09:30:25.423717 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:30:25 crc kubenswrapper[4956]: I0314 09:30:25.424395 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:30:55 crc kubenswrapper[4956]: I0314 09:30:55.030629 4956 scope.go:117] "RemoveContainer" containerID="1e6949f7b0a995c2b2b1709357213db06a2b48332e2fcd2e53fa73affdbc1e82" Mar 14 09:30:55 crc kubenswrapper[4956]: I0314 09:30:55.056100 4956 scope.go:117] "RemoveContainer" containerID="1d98748227b182d14e254f302151164d84ce0ff4e070d3a589d2e88cec797a80" Mar 14 09:30:55 crc kubenswrapper[4956]: I0314 09:30:55.103016 4956 scope.go:117] "RemoveContainer" containerID="ac99d14e2e0d7c316131b7e8e8c690b2aa28487343409ff55615848a1a8bf282" Mar 14 09:30:55 crc 
kubenswrapper[4956]: I0314 09:30:55.423564 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:30:55 crc kubenswrapper[4956]: I0314 09:30:55.423662 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:31:13 crc kubenswrapper[4956]: I0314 09:31:13.418233 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zcwjt"] Mar 14 09:31:13 crc kubenswrapper[4956]: E0314 09:31:13.424328 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25604e27-1692-429b-bb26-e24e41514227" containerName="collect-profiles" Mar 14 09:31:13 crc kubenswrapper[4956]: I0314 09:31:13.424378 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="25604e27-1692-429b-bb26-e24e41514227" containerName="collect-profiles" Mar 14 09:31:13 crc kubenswrapper[4956]: E0314 09:31:13.424409 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5322a936-a1ad-4a11-8264-7f6d088026ac" containerName="oc" Mar 14 09:31:13 crc kubenswrapper[4956]: I0314 09:31:13.424418 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5322a936-a1ad-4a11-8264-7f6d088026ac" containerName="oc" Mar 14 09:31:13 crc kubenswrapper[4956]: I0314 09:31:13.425069 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5322a936-a1ad-4a11-8264-7f6d088026ac" containerName="oc" Mar 14 09:31:13 crc kubenswrapper[4956]: I0314 09:31:13.425128 4956 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="25604e27-1692-429b-bb26-e24e41514227" containerName="collect-profiles" Mar 14 09:31:13 crc kubenswrapper[4956]: I0314 09:31:13.433285 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zcwjt" Mar 14 09:31:13 crc kubenswrapper[4956]: I0314 09:31:13.472017 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zcwjt"] Mar 14 09:31:13 crc kubenswrapper[4956]: I0314 09:31:13.632254 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-catalog-content\") pod \"redhat-operators-zcwjt\" (UID: \"c34b8ae9-e720-4de0-9cd3-212dff0f43f7\") " pod="openshift-marketplace/redhat-operators-zcwjt" Mar 14 09:31:13 crc kubenswrapper[4956]: I0314 09:31:13.632326 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-utilities\") pod \"redhat-operators-zcwjt\" (UID: \"c34b8ae9-e720-4de0-9cd3-212dff0f43f7\") " pod="openshift-marketplace/redhat-operators-zcwjt" Mar 14 09:31:13 crc kubenswrapper[4956]: I0314 09:31:13.632350 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kghcf\" (UniqueName: \"kubernetes.io/projected/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-kube-api-access-kghcf\") pod \"redhat-operators-zcwjt\" (UID: \"c34b8ae9-e720-4de0-9cd3-212dff0f43f7\") " pod="openshift-marketplace/redhat-operators-zcwjt" Mar 14 09:31:13 crc kubenswrapper[4956]: I0314 09:31:13.734316 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-catalog-content\") pod \"redhat-operators-zcwjt\" (UID: 
\"c34b8ae9-e720-4de0-9cd3-212dff0f43f7\") " pod="openshift-marketplace/redhat-operators-zcwjt" Mar 14 09:31:13 crc kubenswrapper[4956]: I0314 09:31:13.734393 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-utilities\") pod \"redhat-operators-zcwjt\" (UID: \"c34b8ae9-e720-4de0-9cd3-212dff0f43f7\") " pod="openshift-marketplace/redhat-operators-zcwjt" Mar 14 09:31:13 crc kubenswrapper[4956]: I0314 09:31:13.734417 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kghcf\" (UniqueName: \"kubernetes.io/projected/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-kube-api-access-kghcf\") pod \"redhat-operators-zcwjt\" (UID: \"c34b8ae9-e720-4de0-9cd3-212dff0f43f7\") " pod="openshift-marketplace/redhat-operators-zcwjt" Mar 14 09:31:13 crc kubenswrapper[4956]: I0314 09:31:13.734999 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-catalog-content\") pod \"redhat-operators-zcwjt\" (UID: \"c34b8ae9-e720-4de0-9cd3-212dff0f43f7\") " pod="openshift-marketplace/redhat-operators-zcwjt" Mar 14 09:31:13 crc kubenswrapper[4956]: I0314 09:31:13.734999 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-utilities\") pod \"redhat-operators-zcwjt\" (UID: \"c34b8ae9-e720-4de0-9cd3-212dff0f43f7\") " pod="openshift-marketplace/redhat-operators-zcwjt" Mar 14 09:31:13 crc kubenswrapper[4956]: I0314 09:31:13.770959 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kghcf\" (UniqueName: \"kubernetes.io/projected/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-kube-api-access-kghcf\") pod \"redhat-operators-zcwjt\" (UID: \"c34b8ae9-e720-4de0-9cd3-212dff0f43f7\") " 
pod="openshift-marketplace/redhat-operators-zcwjt" Mar 14 09:31:14 crc kubenswrapper[4956]: I0314 09:31:14.071607 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zcwjt" Mar 14 09:31:14 crc kubenswrapper[4956]: I0314 09:31:14.535745 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zcwjt"] Mar 14 09:31:14 crc kubenswrapper[4956]: I0314 09:31:14.787475 4956 generic.go:334] "Generic (PLEG): container finished" podID="c34b8ae9-e720-4de0-9cd3-212dff0f43f7" containerID="c646d956997df9967d253bbaaf231bc8c5ae33a1039dd2f8e742a5cd543ffc7e" exitCode=0 Mar 14 09:31:14 crc kubenswrapper[4956]: I0314 09:31:14.787710 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcwjt" event={"ID":"c34b8ae9-e720-4de0-9cd3-212dff0f43f7","Type":"ContainerDied","Data":"c646d956997df9967d253bbaaf231bc8c5ae33a1039dd2f8e742a5cd543ffc7e"} Mar 14 09:31:14 crc kubenswrapper[4956]: I0314 09:31:14.787888 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcwjt" event={"ID":"c34b8ae9-e720-4de0-9cd3-212dff0f43f7","Type":"ContainerStarted","Data":"ba7a9ddcce75c8a17a99f53feb6d9d576f30b9b17e4e9584612e8ca2318801f2"} Mar 14 09:31:15 crc kubenswrapper[4956]: I0314 09:31:15.797711 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcwjt" event={"ID":"c34b8ae9-e720-4de0-9cd3-212dff0f43f7","Type":"ContainerStarted","Data":"bdeee3fd104b976b2ec20389aba403d386810da88307166d993975d6fddb460c"} Mar 14 09:31:16 crc kubenswrapper[4956]: I0314 09:31:16.808479 4956 generic.go:334] "Generic (PLEG): container finished" podID="c34b8ae9-e720-4de0-9cd3-212dff0f43f7" containerID="bdeee3fd104b976b2ec20389aba403d386810da88307166d993975d6fddb460c" exitCode=0 Mar 14 09:31:16 crc kubenswrapper[4956]: I0314 09:31:16.808688 4956 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-zcwjt" event={"ID":"c34b8ae9-e720-4de0-9cd3-212dff0f43f7","Type":"ContainerDied","Data":"bdeee3fd104b976b2ec20389aba403d386810da88307166d993975d6fddb460c"} Mar 14 09:31:17 crc kubenswrapper[4956]: I0314 09:31:17.818021 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcwjt" event={"ID":"c34b8ae9-e720-4de0-9cd3-212dff0f43f7","Type":"ContainerStarted","Data":"9877dbd56b96d57b28d7f48c50b93ea93876a08dbc0debe982d384c40e3bf5c2"} Mar 14 09:31:17 crc kubenswrapper[4956]: I0314 09:31:17.841335 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zcwjt" podStartSLOduration=2.406888365 podStartE2EDuration="4.841317593s" podCreationTimestamp="2026-03-14 09:31:13 +0000 UTC" firstStartedPulling="2026-03-14 09:31:14.78908969 +0000 UTC m=+2080.301781958" lastFinishedPulling="2026-03-14 09:31:17.223518928 +0000 UTC m=+2082.736211186" observedRunningTime="2026-03-14 09:31:17.83881409 +0000 UTC m=+2083.351506368" watchObservedRunningTime="2026-03-14 09:31:17.841317593 +0000 UTC m=+2083.354009861" Mar 14 09:31:24 crc kubenswrapper[4956]: I0314 09:31:24.072513 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zcwjt" Mar 14 09:31:24 crc kubenswrapper[4956]: I0314 09:31:24.072930 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zcwjt" Mar 14 09:31:24 crc kubenswrapper[4956]: I0314 09:31:24.126778 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zcwjt" Mar 14 09:31:24 crc kubenswrapper[4956]: I0314 09:31:24.919243 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zcwjt" Mar 14 09:31:25 crc kubenswrapper[4956]: I0314 09:31:25.423803 4956 patch_prober.go:28] 
interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:31:25 crc kubenswrapper[4956]: I0314 09:31:25.423863 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:31:25 crc kubenswrapper[4956]: I0314 09:31:25.423905 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 09:31:25 crc kubenswrapper[4956]: I0314 09:31:25.424601 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75c968c7158a86ccc8a0fc41e6a65ef91e9a8b27c10e35460c675ff21752f263"} pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:31:25 crc kubenswrapper[4956]: I0314 09:31:25.424658 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" containerID="cri-o://75c968c7158a86ccc8a0fc41e6a65ef91e9a8b27c10e35460c675ff21752f263" gracePeriod=600 Mar 14 09:31:25 crc kubenswrapper[4956]: I0314 09:31:25.885143 4956 generic.go:334] "Generic (PLEG): container finished" podID="9ba20367-e506-422e-a846-eb1525cb3b94" containerID="75c968c7158a86ccc8a0fc41e6a65ef91e9a8b27c10e35460c675ff21752f263" exitCode=0 Mar 14 09:31:25 crc kubenswrapper[4956]: I0314 
09:31:25.885218 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerDied","Data":"75c968c7158a86ccc8a0fc41e6a65ef91e9a8b27c10e35460c675ff21752f263"} Mar 14 09:31:25 crc kubenswrapper[4956]: I0314 09:31:25.885689 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerStarted","Data":"b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a"} Mar 14 09:31:25 crc kubenswrapper[4956]: I0314 09:31:25.885710 4956 scope.go:117] "RemoveContainer" containerID="4f6048f75740611348a8769630ee4f52486c37013d03e5c6ef9a13b3f70782eb" Mar 14 09:31:27 crc kubenswrapper[4956]: I0314 09:31:27.605831 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zcwjt"] Mar 14 09:31:27 crc kubenswrapper[4956]: I0314 09:31:27.606445 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zcwjt" podUID="c34b8ae9-e720-4de0-9cd3-212dff0f43f7" containerName="registry-server" containerID="cri-o://9877dbd56b96d57b28d7f48c50b93ea93876a08dbc0debe982d384c40e3bf5c2" gracePeriod=2 Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.706390 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zcwjt" Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.792672 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kghcf\" (UniqueName: \"kubernetes.io/projected/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-kube-api-access-kghcf\") pod \"c34b8ae9-e720-4de0-9cd3-212dff0f43f7\" (UID: \"c34b8ae9-e720-4de0-9cd3-212dff0f43f7\") " Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.792778 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-utilities\") pod \"c34b8ae9-e720-4de0-9cd3-212dff0f43f7\" (UID: \"c34b8ae9-e720-4de0-9cd3-212dff0f43f7\") " Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.792911 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-catalog-content\") pod \"c34b8ae9-e720-4de0-9cd3-212dff0f43f7\" (UID: \"c34b8ae9-e720-4de0-9cd3-212dff0f43f7\") " Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.794101 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-utilities" (OuterVolumeSpecName: "utilities") pod "c34b8ae9-e720-4de0-9cd3-212dff0f43f7" (UID: "c34b8ae9-e720-4de0-9cd3-212dff0f43f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.800327 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-kube-api-access-kghcf" (OuterVolumeSpecName: "kube-api-access-kghcf") pod "c34b8ae9-e720-4de0-9cd3-212dff0f43f7" (UID: "c34b8ae9-e720-4de0-9cd3-212dff0f43f7"). InnerVolumeSpecName "kube-api-access-kghcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.894768 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.894807 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kghcf\" (UniqueName: \"kubernetes.io/projected/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-kube-api-access-kghcf\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.913125 4956 generic.go:334] "Generic (PLEG): container finished" podID="c34b8ae9-e720-4de0-9cd3-212dff0f43f7" containerID="9877dbd56b96d57b28d7f48c50b93ea93876a08dbc0debe982d384c40e3bf5c2" exitCode=0 Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.913171 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcwjt" event={"ID":"c34b8ae9-e720-4de0-9cd3-212dff0f43f7","Type":"ContainerDied","Data":"9877dbd56b96d57b28d7f48c50b93ea93876a08dbc0debe982d384c40e3bf5c2"} Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.913203 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcwjt" event={"ID":"c34b8ae9-e720-4de0-9cd3-212dff0f43f7","Type":"ContainerDied","Data":"ba7a9ddcce75c8a17a99f53feb6d9d576f30b9b17e4e9584612e8ca2318801f2"} Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.913226 4956 scope.go:117] "RemoveContainer" containerID="9877dbd56b96d57b28d7f48c50b93ea93876a08dbc0debe982d384c40e3bf5c2" Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.913365 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zcwjt" Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.929594 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c34b8ae9-e720-4de0-9cd3-212dff0f43f7" (UID: "c34b8ae9-e720-4de0-9cd3-212dff0f43f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.939307 4956 scope.go:117] "RemoveContainer" containerID="bdeee3fd104b976b2ec20389aba403d386810da88307166d993975d6fddb460c" Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.960173 4956 scope.go:117] "RemoveContainer" containerID="c646d956997df9967d253bbaaf231bc8c5ae33a1039dd2f8e742a5cd543ffc7e" Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.996098 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c34b8ae9-e720-4de0-9cd3-212dff0f43f7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.996614 4956 scope.go:117] "RemoveContainer" containerID="9877dbd56b96d57b28d7f48c50b93ea93876a08dbc0debe982d384c40e3bf5c2" Mar 14 09:31:28 crc kubenswrapper[4956]: E0314 09:31:28.997140 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9877dbd56b96d57b28d7f48c50b93ea93876a08dbc0debe982d384c40e3bf5c2\": container with ID starting with 9877dbd56b96d57b28d7f48c50b93ea93876a08dbc0debe982d384c40e3bf5c2 not found: ID does not exist" containerID="9877dbd56b96d57b28d7f48c50b93ea93876a08dbc0debe982d384c40e3bf5c2" Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.997185 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9877dbd56b96d57b28d7f48c50b93ea93876a08dbc0debe982d384c40e3bf5c2"} err="failed to get container status \"9877dbd56b96d57b28d7f48c50b93ea93876a08dbc0debe982d384c40e3bf5c2\": rpc error: code = NotFound desc = could not find container \"9877dbd56b96d57b28d7f48c50b93ea93876a08dbc0debe982d384c40e3bf5c2\": container with ID starting with 9877dbd56b96d57b28d7f48c50b93ea93876a08dbc0debe982d384c40e3bf5c2 not found: ID does not exist" Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.997211 4956 scope.go:117] "RemoveContainer" containerID="bdeee3fd104b976b2ec20389aba403d386810da88307166d993975d6fddb460c" Mar 14 09:31:28 crc kubenswrapper[4956]: E0314 09:31:28.997685 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdeee3fd104b976b2ec20389aba403d386810da88307166d993975d6fddb460c\": container with ID starting with bdeee3fd104b976b2ec20389aba403d386810da88307166d993975d6fddb460c not found: ID does not exist" containerID="bdeee3fd104b976b2ec20389aba403d386810da88307166d993975d6fddb460c" Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.997721 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdeee3fd104b976b2ec20389aba403d386810da88307166d993975d6fddb460c"} err="failed to get container status \"bdeee3fd104b976b2ec20389aba403d386810da88307166d993975d6fddb460c\": rpc error: code = NotFound desc = could not find container \"bdeee3fd104b976b2ec20389aba403d386810da88307166d993975d6fddb460c\": container with ID starting with bdeee3fd104b976b2ec20389aba403d386810da88307166d993975d6fddb460c not found: ID does not exist" Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.997768 4956 scope.go:117] "RemoveContainer" containerID="c646d956997df9967d253bbaaf231bc8c5ae33a1039dd2f8e742a5cd543ffc7e" Mar 14 09:31:28 crc kubenswrapper[4956]: E0314 09:31:28.998070 4956 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c646d956997df9967d253bbaaf231bc8c5ae33a1039dd2f8e742a5cd543ffc7e\": container with ID starting with c646d956997df9967d253bbaaf231bc8c5ae33a1039dd2f8e742a5cd543ffc7e not found: ID does not exist" containerID="c646d956997df9967d253bbaaf231bc8c5ae33a1039dd2f8e742a5cd543ffc7e" Mar 14 09:31:28 crc kubenswrapper[4956]: I0314 09:31:28.998100 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c646d956997df9967d253bbaaf231bc8c5ae33a1039dd2f8e742a5cd543ffc7e"} err="failed to get container status \"c646d956997df9967d253bbaaf231bc8c5ae33a1039dd2f8e742a5cd543ffc7e\": rpc error: code = NotFound desc = could not find container \"c646d956997df9967d253bbaaf231bc8c5ae33a1039dd2f8e742a5cd543ffc7e\": container with ID starting with c646d956997df9967d253bbaaf231bc8c5ae33a1039dd2f8e742a5cd543ffc7e not found: ID does not exist" Mar 14 09:31:29 crc kubenswrapper[4956]: I0314 09:31:29.248900 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zcwjt"] Mar 14 09:31:29 crc kubenswrapper[4956]: I0314 09:31:29.255950 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zcwjt"] Mar 14 09:31:31 crc kubenswrapper[4956]: I0314 09:31:31.219835 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c34b8ae9-e720-4de0-9cd3-212dff0f43f7" path="/var/lib/kubelet/pods/c34b8ae9-e720-4de0-9cd3-212dff0f43f7/volumes" Mar 14 09:32:00 crc kubenswrapper[4956]: I0314 09:32:00.142577 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558012-nsvh5"] Mar 14 09:32:00 crc kubenswrapper[4956]: E0314 09:32:00.143595 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c34b8ae9-e720-4de0-9cd3-212dff0f43f7" containerName="extract-utilities" Mar 14 09:32:00 crc kubenswrapper[4956]: I0314 09:32:00.143613 4956 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c34b8ae9-e720-4de0-9cd3-212dff0f43f7" containerName="extract-utilities" Mar 14 09:32:00 crc kubenswrapper[4956]: E0314 09:32:00.143629 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c34b8ae9-e720-4de0-9cd3-212dff0f43f7" containerName="registry-server" Mar 14 09:32:00 crc kubenswrapper[4956]: I0314 09:32:00.143637 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34b8ae9-e720-4de0-9cd3-212dff0f43f7" containerName="registry-server" Mar 14 09:32:00 crc kubenswrapper[4956]: E0314 09:32:00.143654 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c34b8ae9-e720-4de0-9cd3-212dff0f43f7" containerName="extract-content" Mar 14 09:32:00 crc kubenswrapper[4956]: I0314 09:32:00.143662 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34b8ae9-e720-4de0-9cd3-212dff0f43f7" containerName="extract-content" Mar 14 09:32:00 crc kubenswrapper[4956]: I0314 09:32:00.143849 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="c34b8ae9-e720-4de0-9cd3-212dff0f43f7" containerName="registry-server" Mar 14 09:32:00 crc kubenswrapper[4956]: I0314 09:32:00.144556 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558012-nsvh5" Mar 14 09:32:00 crc kubenswrapper[4956]: I0314 09:32:00.146779 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:32:00 crc kubenswrapper[4956]: I0314 09:32:00.147352 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:32:00 crc kubenswrapper[4956]: I0314 09:32:00.147400 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:32:00 crc kubenswrapper[4956]: I0314 09:32:00.152190 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558012-nsvh5"] Mar 14 09:32:00 crc kubenswrapper[4956]: I0314 09:32:00.211424 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrmlv\" (UniqueName: \"kubernetes.io/projected/8265b6df-2302-441f-95b2-3115520e0c53-kube-api-access-jrmlv\") pod \"auto-csr-approver-29558012-nsvh5\" (UID: \"8265b6df-2302-441f-95b2-3115520e0c53\") " pod="openshift-infra/auto-csr-approver-29558012-nsvh5" Mar 14 09:32:00 crc kubenswrapper[4956]: I0314 09:32:00.313275 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrmlv\" (UniqueName: \"kubernetes.io/projected/8265b6df-2302-441f-95b2-3115520e0c53-kube-api-access-jrmlv\") pod \"auto-csr-approver-29558012-nsvh5\" (UID: \"8265b6df-2302-441f-95b2-3115520e0c53\") " pod="openshift-infra/auto-csr-approver-29558012-nsvh5" Mar 14 09:32:00 crc kubenswrapper[4956]: I0314 09:32:00.340463 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrmlv\" (UniqueName: \"kubernetes.io/projected/8265b6df-2302-441f-95b2-3115520e0c53-kube-api-access-jrmlv\") pod \"auto-csr-approver-29558012-nsvh5\" (UID: \"8265b6df-2302-441f-95b2-3115520e0c53\") " 
pod="openshift-infra/auto-csr-approver-29558012-nsvh5" Mar 14 09:32:00 crc kubenswrapper[4956]: I0314 09:32:00.466450 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558012-nsvh5" Mar 14 09:32:00 crc kubenswrapper[4956]: I0314 09:32:00.878817 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558012-nsvh5"] Mar 14 09:32:01 crc kubenswrapper[4956]: I0314 09:32:01.169117 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558012-nsvh5" event={"ID":"8265b6df-2302-441f-95b2-3115520e0c53","Type":"ContainerStarted","Data":"9a491826915b0042a17dc46d90d6bbb90c0dca9738650e744908ee0dd26bcd1b"} Mar 14 09:32:04 crc kubenswrapper[4956]: I0314 09:32:04.191336 4956 generic.go:334] "Generic (PLEG): container finished" podID="8265b6df-2302-441f-95b2-3115520e0c53" containerID="2edd9981f00a5617f93a39d6d35ad463add520753842bb1b43158b0a1c81cfb2" exitCode=0 Mar 14 09:32:04 crc kubenswrapper[4956]: I0314 09:32:04.191377 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558012-nsvh5" event={"ID":"8265b6df-2302-441f-95b2-3115520e0c53","Type":"ContainerDied","Data":"2edd9981f00a5617f93a39d6d35ad463add520753842bb1b43158b0a1c81cfb2"} Mar 14 09:32:05 crc kubenswrapper[4956]: I0314 09:32:05.515823 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558012-nsvh5" Mar 14 09:32:05 crc kubenswrapper[4956]: I0314 09:32:05.596039 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrmlv\" (UniqueName: \"kubernetes.io/projected/8265b6df-2302-441f-95b2-3115520e0c53-kube-api-access-jrmlv\") pod \"8265b6df-2302-441f-95b2-3115520e0c53\" (UID: \"8265b6df-2302-441f-95b2-3115520e0c53\") " Mar 14 09:32:05 crc kubenswrapper[4956]: I0314 09:32:05.601638 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8265b6df-2302-441f-95b2-3115520e0c53-kube-api-access-jrmlv" (OuterVolumeSpecName: "kube-api-access-jrmlv") pod "8265b6df-2302-441f-95b2-3115520e0c53" (UID: "8265b6df-2302-441f-95b2-3115520e0c53"). InnerVolumeSpecName "kube-api-access-jrmlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:32:05 crc kubenswrapper[4956]: I0314 09:32:05.697992 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrmlv\" (UniqueName: \"kubernetes.io/projected/8265b6df-2302-441f-95b2-3115520e0c53-kube-api-access-jrmlv\") on node \"crc\" DevicePath \"\"" Mar 14 09:32:06 crc kubenswrapper[4956]: I0314 09:32:06.224312 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558012-nsvh5" event={"ID":"8265b6df-2302-441f-95b2-3115520e0c53","Type":"ContainerDied","Data":"9a491826915b0042a17dc46d90d6bbb90c0dca9738650e744908ee0dd26bcd1b"} Mar 14 09:32:06 crc kubenswrapper[4956]: I0314 09:32:06.224349 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a491826915b0042a17dc46d90d6bbb90c0dca9738650e744908ee0dd26bcd1b" Mar 14 09:32:06 crc kubenswrapper[4956]: I0314 09:32:06.224369 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558012-nsvh5" Mar 14 09:32:06 crc kubenswrapper[4956]: I0314 09:32:06.577858 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558006-ms242"] Mar 14 09:32:06 crc kubenswrapper[4956]: I0314 09:32:06.583425 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558006-ms242"] Mar 14 09:32:07 crc kubenswrapper[4956]: I0314 09:32:07.218639 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2283b32b-1f0b-4db4-b7c7-d31b7e162806" path="/var/lib/kubelet/pods/2283b32b-1f0b-4db4-b7c7-d31b7e162806/volumes" Mar 14 09:32:55 crc kubenswrapper[4956]: I0314 09:32:55.236392 4956 scope.go:117] "RemoveContainer" containerID="869e41b959441f202198c6d10e3149b7e036d45e993b0afa9acbc3288b697396" Mar 14 09:33:16 crc kubenswrapper[4956]: I0314 09:33:16.853050 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-59mt4"] Mar 14 09:33:16 crc kubenswrapper[4956]: E0314 09:33:16.854225 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8265b6df-2302-441f-95b2-3115520e0c53" containerName="oc" Mar 14 09:33:16 crc kubenswrapper[4956]: I0314 09:33:16.854240 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8265b6df-2302-441f-95b2-3115520e0c53" containerName="oc" Mar 14 09:33:16 crc kubenswrapper[4956]: I0314 09:33:16.854427 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8265b6df-2302-441f-95b2-3115520e0c53" containerName="oc" Mar 14 09:33:16 crc kubenswrapper[4956]: I0314 09:33:16.855983 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-59mt4" Mar 14 09:33:16 crc kubenswrapper[4956]: I0314 09:33:16.874940 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-59mt4"] Mar 14 09:33:16 crc kubenswrapper[4956]: I0314 09:33:16.960887 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghp7l\" (UniqueName: \"kubernetes.io/projected/bfea906e-1872-44a1-888c-51e8a34ff58c-kube-api-access-ghp7l\") pod \"certified-operators-59mt4\" (UID: \"bfea906e-1872-44a1-888c-51e8a34ff58c\") " pod="openshift-marketplace/certified-operators-59mt4" Mar 14 09:33:16 crc kubenswrapper[4956]: I0314 09:33:16.960948 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfea906e-1872-44a1-888c-51e8a34ff58c-catalog-content\") pod \"certified-operators-59mt4\" (UID: \"bfea906e-1872-44a1-888c-51e8a34ff58c\") " pod="openshift-marketplace/certified-operators-59mt4" Mar 14 09:33:16 crc kubenswrapper[4956]: I0314 09:33:16.960979 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfea906e-1872-44a1-888c-51e8a34ff58c-utilities\") pod \"certified-operators-59mt4\" (UID: \"bfea906e-1872-44a1-888c-51e8a34ff58c\") " pod="openshift-marketplace/certified-operators-59mt4" Mar 14 09:33:17 crc kubenswrapper[4956]: I0314 09:33:17.063569 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghp7l\" (UniqueName: \"kubernetes.io/projected/bfea906e-1872-44a1-888c-51e8a34ff58c-kube-api-access-ghp7l\") pod \"certified-operators-59mt4\" (UID: \"bfea906e-1872-44a1-888c-51e8a34ff58c\") " pod="openshift-marketplace/certified-operators-59mt4" Mar 14 09:33:17 crc kubenswrapper[4956]: I0314 09:33:17.063647 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfea906e-1872-44a1-888c-51e8a34ff58c-catalog-content\") pod \"certified-operators-59mt4\" (UID: \"bfea906e-1872-44a1-888c-51e8a34ff58c\") " pod="openshift-marketplace/certified-operators-59mt4" Mar 14 09:33:17 crc kubenswrapper[4956]: I0314 09:33:17.063705 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfea906e-1872-44a1-888c-51e8a34ff58c-utilities\") pod \"certified-operators-59mt4\" (UID: \"bfea906e-1872-44a1-888c-51e8a34ff58c\") " pod="openshift-marketplace/certified-operators-59mt4" Mar 14 09:33:17 crc kubenswrapper[4956]: I0314 09:33:17.064268 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfea906e-1872-44a1-888c-51e8a34ff58c-utilities\") pod \"certified-operators-59mt4\" (UID: \"bfea906e-1872-44a1-888c-51e8a34ff58c\") " pod="openshift-marketplace/certified-operators-59mt4" Mar 14 09:33:17 crc kubenswrapper[4956]: I0314 09:33:17.064296 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfea906e-1872-44a1-888c-51e8a34ff58c-catalog-content\") pod \"certified-operators-59mt4\" (UID: \"bfea906e-1872-44a1-888c-51e8a34ff58c\") " pod="openshift-marketplace/certified-operators-59mt4" Mar 14 09:33:17 crc kubenswrapper[4956]: I0314 09:33:17.082419 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghp7l\" (UniqueName: \"kubernetes.io/projected/bfea906e-1872-44a1-888c-51e8a34ff58c-kube-api-access-ghp7l\") pod \"certified-operators-59mt4\" (UID: \"bfea906e-1872-44a1-888c-51e8a34ff58c\") " pod="openshift-marketplace/certified-operators-59mt4" Mar 14 09:33:17 crc kubenswrapper[4956]: I0314 09:33:17.177936 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-59mt4" Mar 14 09:33:17 crc kubenswrapper[4956]: I0314 09:33:17.463505 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-59mt4"] Mar 14 09:33:17 crc kubenswrapper[4956]: I0314 09:33:17.769762 4956 generic.go:334] "Generic (PLEG): container finished" podID="bfea906e-1872-44a1-888c-51e8a34ff58c" containerID="0d7ca40dde46cb5287788bce71d87aee026d469d3a61a628ed0074d0de8a81f1" exitCode=0 Mar 14 09:33:17 crc kubenswrapper[4956]: I0314 09:33:17.769834 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59mt4" event={"ID":"bfea906e-1872-44a1-888c-51e8a34ff58c","Type":"ContainerDied","Data":"0d7ca40dde46cb5287788bce71d87aee026d469d3a61a628ed0074d0de8a81f1"} Mar 14 09:33:17 crc kubenswrapper[4956]: I0314 09:33:17.770062 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59mt4" event={"ID":"bfea906e-1872-44a1-888c-51e8a34ff58c","Type":"ContainerStarted","Data":"14c65c4938248ae09ecb2289323e02f9ff86705413a5b9a5fd9d41549ddc466d"} Mar 14 09:33:17 crc kubenswrapper[4956]: I0314 09:33:17.772168 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:33:18 crc kubenswrapper[4956]: I0314 09:33:18.779644 4956 generic.go:334] "Generic (PLEG): container finished" podID="bfea906e-1872-44a1-888c-51e8a34ff58c" containerID="411257181ba5273fb996f3f91982693b9e8e168f1682fe84ec02cfbe9f51dfe5" exitCode=0 Mar 14 09:33:18 crc kubenswrapper[4956]: I0314 09:33:18.779738 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59mt4" event={"ID":"bfea906e-1872-44a1-888c-51e8a34ff58c","Type":"ContainerDied","Data":"411257181ba5273fb996f3f91982693b9e8e168f1682fe84ec02cfbe9f51dfe5"} Mar 14 09:33:19 crc kubenswrapper[4956]: I0314 09:33:19.789375 4956 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-59mt4" event={"ID":"bfea906e-1872-44a1-888c-51e8a34ff58c","Type":"ContainerStarted","Data":"9c95424003cabee36f2c41aabfbd0ab5e75ebfb3099c032bf691134138ce9c0d"} Mar 14 09:33:19 crc kubenswrapper[4956]: I0314 09:33:19.811067 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-59mt4" podStartSLOduration=2.400939573 podStartE2EDuration="3.811046826s" podCreationTimestamp="2026-03-14 09:33:16 +0000 UTC" firstStartedPulling="2026-03-14 09:33:17.771866555 +0000 UTC m=+2203.284558823" lastFinishedPulling="2026-03-14 09:33:19.181973768 +0000 UTC m=+2204.694666076" observedRunningTime="2026-03-14 09:33:19.80529526 +0000 UTC m=+2205.317987528" watchObservedRunningTime="2026-03-14 09:33:19.811046826 +0000 UTC m=+2205.323739114" Mar 14 09:33:25 crc kubenswrapper[4956]: I0314 09:33:25.423504 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:33:25 crc kubenswrapper[4956]: I0314 09:33:25.424120 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:33:27 crc kubenswrapper[4956]: I0314 09:33:27.178625 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-59mt4" Mar 14 09:33:27 crc kubenswrapper[4956]: I0314 09:33:27.179408 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-59mt4" Mar 14 09:33:27 
crc kubenswrapper[4956]: I0314 09:33:27.225185 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-59mt4" Mar 14 09:33:27 crc kubenswrapper[4956]: I0314 09:33:27.897566 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-59mt4" Mar 14 09:33:30 crc kubenswrapper[4956]: I0314 09:33:30.828743 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-59mt4"] Mar 14 09:33:30 crc kubenswrapper[4956]: I0314 09:33:30.878703 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-59mt4" podUID="bfea906e-1872-44a1-888c-51e8a34ff58c" containerName="registry-server" containerID="cri-o://9c95424003cabee36f2c41aabfbd0ab5e75ebfb3099c032bf691134138ce9c0d" gracePeriod=2 Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.281687 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-59mt4" Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.394969 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfea906e-1872-44a1-888c-51e8a34ff58c-utilities\") pod \"bfea906e-1872-44a1-888c-51e8a34ff58c\" (UID: \"bfea906e-1872-44a1-888c-51e8a34ff58c\") " Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.395049 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghp7l\" (UniqueName: \"kubernetes.io/projected/bfea906e-1872-44a1-888c-51e8a34ff58c-kube-api-access-ghp7l\") pod \"bfea906e-1872-44a1-888c-51e8a34ff58c\" (UID: \"bfea906e-1872-44a1-888c-51e8a34ff58c\") " Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.395204 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfea906e-1872-44a1-888c-51e8a34ff58c-catalog-content\") pod \"bfea906e-1872-44a1-888c-51e8a34ff58c\" (UID: \"bfea906e-1872-44a1-888c-51e8a34ff58c\") " Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.395827 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfea906e-1872-44a1-888c-51e8a34ff58c-utilities" (OuterVolumeSpecName: "utilities") pod "bfea906e-1872-44a1-888c-51e8a34ff58c" (UID: "bfea906e-1872-44a1-888c-51e8a34ff58c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.400608 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfea906e-1872-44a1-888c-51e8a34ff58c-kube-api-access-ghp7l" (OuterVolumeSpecName: "kube-api-access-ghp7l") pod "bfea906e-1872-44a1-888c-51e8a34ff58c" (UID: "bfea906e-1872-44a1-888c-51e8a34ff58c"). InnerVolumeSpecName "kube-api-access-ghp7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.442714 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfea906e-1872-44a1-888c-51e8a34ff58c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfea906e-1872-44a1-888c-51e8a34ff58c" (UID: "bfea906e-1872-44a1-888c-51e8a34ff58c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.497301 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfea906e-1872-44a1-888c-51e8a34ff58c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.497334 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfea906e-1872-44a1-888c-51e8a34ff58c-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.497344 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghp7l\" (UniqueName: \"kubernetes.io/projected/bfea906e-1872-44a1-888c-51e8a34ff58c-kube-api-access-ghp7l\") on node \"crc\" DevicePath \"\"" Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.888771 4956 generic.go:334] "Generic (PLEG): container finished" podID="bfea906e-1872-44a1-888c-51e8a34ff58c" containerID="9c95424003cabee36f2c41aabfbd0ab5e75ebfb3099c032bf691134138ce9c0d" exitCode=0 Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.888820 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59mt4" event={"ID":"bfea906e-1872-44a1-888c-51e8a34ff58c","Type":"ContainerDied","Data":"9c95424003cabee36f2c41aabfbd0ab5e75ebfb3099c032bf691134138ce9c0d"} Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.888855 4956 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-59mt4" event={"ID":"bfea906e-1872-44a1-888c-51e8a34ff58c","Type":"ContainerDied","Data":"14c65c4938248ae09ecb2289323e02f9ff86705413a5b9a5fd9d41549ddc466d"} Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.888871 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-59mt4" Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.888885 4956 scope.go:117] "RemoveContainer" containerID="9c95424003cabee36f2c41aabfbd0ab5e75ebfb3099c032bf691134138ce9c0d" Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.907873 4956 scope.go:117] "RemoveContainer" containerID="411257181ba5273fb996f3f91982693b9e8e168f1682fe84ec02cfbe9f51dfe5" Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.922687 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-59mt4"] Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.928915 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-59mt4"] Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.938429 4956 scope.go:117] "RemoveContainer" containerID="0d7ca40dde46cb5287788bce71d87aee026d469d3a61a628ed0074d0de8a81f1" Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.958309 4956 scope.go:117] "RemoveContainer" containerID="9c95424003cabee36f2c41aabfbd0ab5e75ebfb3099c032bf691134138ce9c0d" Mar 14 09:33:31 crc kubenswrapper[4956]: E0314 09:33:31.958797 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c95424003cabee36f2c41aabfbd0ab5e75ebfb3099c032bf691134138ce9c0d\": container with ID starting with 9c95424003cabee36f2c41aabfbd0ab5e75ebfb3099c032bf691134138ce9c0d not found: ID does not exist" containerID="9c95424003cabee36f2c41aabfbd0ab5e75ebfb3099c032bf691134138ce9c0d" Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 
09:33:31.958840 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c95424003cabee36f2c41aabfbd0ab5e75ebfb3099c032bf691134138ce9c0d"} err="failed to get container status \"9c95424003cabee36f2c41aabfbd0ab5e75ebfb3099c032bf691134138ce9c0d\": rpc error: code = NotFound desc = could not find container \"9c95424003cabee36f2c41aabfbd0ab5e75ebfb3099c032bf691134138ce9c0d\": container with ID starting with 9c95424003cabee36f2c41aabfbd0ab5e75ebfb3099c032bf691134138ce9c0d not found: ID does not exist" Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.958866 4956 scope.go:117] "RemoveContainer" containerID="411257181ba5273fb996f3f91982693b9e8e168f1682fe84ec02cfbe9f51dfe5" Mar 14 09:33:31 crc kubenswrapper[4956]: E0314 09:33:31.959248 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"411257181ba5273fb996f3f91982693b9e8e168f1682fe84ec02cfbe9f51dfe5\": container with ID starting with 411257181ba5273fb996f3f91982693b9e8e168f1682fe84ec02cfbe9f51dfe5 not found: ID does not exist" containerID="411257181ba5273fb996f3f91982693b9e8e168f1682fe84ec02cfbe9f51dfe5" Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.959281 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"411257181ba5273fb996f3f91982693b9e8e168f1682fe84ec02cfbe9f51dfe5"} err="failed to get container status \"411257181ba5273fb996f3f91982693b9e8e168f1682fe84ec02cfbe9f51dfe5\": rpc error: code = NotFound desc = could not find container \"411257181ba5273fb996f3f91982693b9e8e168f1682fe84ec02cfbe9f51dfe5\": container with ID starting with 411257181ba5273fb996f3f91982693b9e8e168f1682fe84ec02cfbe9f51dfe5 not found: ID does not exist" Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.959303 4956 scope.go:117] "RemoveContainer" containerID="0d7ca40dde46cb5287788bce71d87aee026d469d3a61a628ed0074d0de8a81f1" Mar 14 09:33:31 crc 
kubenswrapper[4956]: E0314 09:33:31.959746 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d7ca40dde46cb5287788bce71d87aee026d469d3a61a628ed0074d0de8a81f1\": container with ID starting with 0d7ca40dde46cb5287788bce71d87aee026d469d3a61a628ed0074d0de8a81f1 not found: ID does not exist" containerID="0d7ca40dde46cb5287788bce71d87aee026d469d3a61a628ed0074d0de8a81f1" Mar 14 09:33:31 crc kubenswrapper[4956]: I0314 09:33:31.959771 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d7ca40dde46cb5287788bce71d87aee026d469d3a61a628ed0074d0de8a81f1"} err="failed to get container status \"0d7ca40dde46cb5287788bce71d87aee026d469d3a61a628ed0074d0de8a81f1\": rpc error: code = NotFound desc = could not find container \"0d7ca40dde46cb5287788bce71d87aee026d469d3a61a628ed0074d0de8a81f1\": container with ID starting with 0d7ca40dde46cb5287788bce71d87aee026d469d3a61a628ed0074d0de8a81f1 not found: ID does not exist" Mar 14 09:33:33 crc kubenswrapper[4956]: I0314 09:33:33.225199 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfea906e-1872-44a1-888c-51e8a34ff58c" path="/var/lib/kubelet/pods/bfea906e-1872-44a1-888c-51e8a34ff58c/volumes" Mar 14 09:33:39 crc kubenswrapper[4956]: I0314 09:33:39.243286 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lhqgs"] Mar 14 09:33:39 crc kubenswrapper[4956]: E0314 09:33:39.244983 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfea906e-1872-44a1-888c-51e8a34ff58c" containerName="registry-server" Mar 14 09:33:39 crc kubenswrapper[4956]: I0314 09:33:39.245061 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfea906e-1872-44a1-888c-51e8a34ff58c" containerName="registry-server" Mar 14 09:33:39 crc kubenswrapper[4956]: E0314 09:33:39.245144 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bfea906e-1872-44a1-888c-51e8a34ff58c" containerName="extract-utilities" Mar 14 09:33:39 crc kubenswrapper[4956]: I0314 09:33:39.245286 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfea906e-1872-44a1-888c-51e8a34ff58c" containerName="extract-utilities" Mar 14 09:33:39 crc kubenswrapper[4956]: E0314 09:33:39.245351 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfea906e-1872-44a1-888c-51e8a34ff58c" containerName="extract-content" Mar 14 09:33:39 crc kubenswrapper[4956]: I0314 09:33:39.245406 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfea906e-1872-44a1-888c-51e8a34ff58c" containerName="extract-content" Mar 14 09:33:39 crc kubenswrapper[4956]: I0314 09:33:39.245638 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfea906e-1872-44a1-888c-51e8a34ff58c" containerName="registry-server" Mar 14 09:33:39 crc kubenswrapper[4956]: I0314 09:33:39.246928 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lhqgs" Mar 14 09:33:39 crc kubenswrapper[4956]: I0314 09:33:39.258066 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lhqgs"] Mar 14 09:33:39 crc kubenswrapper[4956]: I0314 09:33:39.324822 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-catalog-content\") pod \"redhat-marketplace-lhqgs\" (UID: \"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb\") " pod="openshift-marketplace/redhat-marketplace-lhqgs" Mar 14 09:33:39 crc kubenswrapper[4956]: I0314 09:33:39.324880 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcnn2\" (UniqueName: \"kubernetes.io/projected/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-kube-api-access-vcnn2\") pod \"redhat-marketplace-lhqgs\" (UID: 
\"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb\") " pod="openshift-marketplace/redhat-marketplace-lhqgs" Mar 14 09:33:39 crc kubenswrapper[4956]: I0314 09:33:39.324965 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-utilities\") pod \"redhat-marketplace-lhqgs\" (UID: \"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb\") " pod="openshift-marketplace/redhat-marketplace-lhqgs" Mar 14 09:33:39 crc kubenswrapper[4956]: I0314 09:33:39.426213 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-catalog-content\") pod \"redhat-marketplace-lhqgs\" (UID: \"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb\") " pod="openshift-marketplace/redhat-marketplace-lhqgs" Mar 14 09:33:39 crc kubenswrapper[4956]: I0314 09:33:39.426272 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcnn2\" (UniqueName: \"kubernetes.io/projected/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-kube-api-access-vcnn2\") pod \"redhat-marketplace-lhqgs\" (UID: \"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb\") " pod="openshift-marketplace/redhat-marketplace-lhqgs" Mar 14 09:33:39 crc kubenswrapper[4956]: I0314 09:33:39.426320 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-utilities\") pod \"redhat-marketplace-lhqgs\" (UID: \"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb\") " pod="openshift-marketplace/redhat-marketplace-lhqgs" Mar 14 09:33:39 crc kubenswrapper[4956]: I0314 09:33:39.426807 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-utilities\") pod \"redhat-marketplace-lhqgs\" (UID: 
\"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb\") " pod="openshift-marketplace/redhat-marketplace-lhqgs" Mar 14 09:33:39 crc kubenswrapper[4956]: I0314 09:33:39.427030 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-catalog-content\") pod \"redhat-marketplace-lhqgs\" (UID: \"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb\") " pod="openshift-marketplace/redhat-marketplace-lhqgs" Mar 14 09:33:39 crc kubenswrapper[4956]: I0314 09:33:39.448080 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcnn2\" (UniqueName: \"kubernetes.io/projected/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-kube-api-access-vcnn2\") pod \"redhat-marketplace-lhqgs\" (UID: \"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb\") " pod="openshift-marketplace/redhat-marketplace-lhqgs" Mar 14 09:33:39 crc kubenswrapper[4956]: I0314 09:33:39.566735 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lhqgs" Mar 14 09:33:40 crc kubenswrapper[4956]: I0314 09:33:40.003028 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lhqgs"] Mar 14 09:33:40 crc kubenswrapper[4956]: I0314 09:33:40.967982 4956 generic.go:334] "Generic (PLEG): container finished" podID="0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb" containerID="f755e9990317c101293bd7339f4aa8537a85570cc4ad08d2b73a66a238165aad" exitCode=0 Mar 14 09:33:40 crc kubenswrapper[4956]: I0314 09:33:40.968077 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lhqgs" event={"ID":"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb","Type":"ContainerDied","Data":"f755e9990317c101293bd7339f4aa8537a85570cc4ad08d2b73a66a238165aad"} Mar 14 09:33:40 crc kubenswrapper[4956]: I0314 09:33:40.968283 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lhqgs" 
event={"ID":"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb","Type":"ContainerStarted","Data":"f558aa868513f53568dac28d18a58432a78e6d60babde5751fb8b2805480a3ef"} Mar 14 09:33:41 crc kubenswrapper[4956]: I0314 09:33:41.976402 4956 generic.go:334] "Generic (PLEG): container finished" podID="0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb" containerID="ae57f6725505eacf9f5408536b02d8b676bdd128d6b309b38b91fbd3642252be" exitCode=0 Mar 14 09:33:41 crc kubenswrapper[4956]: I0314 09:33:41.976496 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lhqgs" event={"ID":"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb","Type":"ContainerDied","Data":"ae57f6725505eacf9f5408536b02d8b676bdd128d6b309b38b91fbd3642252be"} Mar 14 09:33:42 crc kubenswrapper[4956]: I0314 09:33:42.991003 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lhqgs" event={"ID":"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb","Type":"ContainerStarted","Data":"078d669d2df4819508d75c4da8d573b742cbd2f99baf478951163df58e6899ae"} Mar 14 09:33:43 crc kubenswrapper[4956]: I0314 09:33:43.019921 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lhqgs" podStartSLOduration=2.615708495 podStartE2EDuration="4.019900017s" podCreationTimestamp="2026-03-14 09:33:39 +0000 UTC" firstStartedPulling="2026-03-14 09:33:40.970166467 +0000 UTC m=+2226.482858735" lastFinishedPulling="2026-03-14 09:33:42.374357979 +0000 UTC m=+2227.887050257" observedRunningTime="2026-03-14 09:33:43.017078075 +0000 UTC m=+2228.529770353" watchObservedRunningTime="2026-03-14 09:33:43.019900017 +0000 UTC m=+2228.532592285" Mar 14 09:33:49 crc kubenswrapper[4956]: I0314 09:33:49.567755 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lhqgs" Mar 14 09:33:49 crc kubenswrapper[4956]: I0314 09:33:49.568778 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lhqgs" Mar 14 09:33:49 crc kubenswrapper[4956]: I0314 09:33:49.643450 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lhqgs" Mar 14 09:33:50 crc kubenswrapper[4956]: I0314 09:33:50.116808 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lhqgs" Mar 14 09:33:53 crc kubenswrapper[4956]: I0314 09:33:53.636197 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lhqgs"] Mar 14 09:33:53 crc kubenswrapper[4956]: I0314 09:33:53.636807 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lhqgs" podUID="0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb" containerName="registry-server" containerID="cri-o://078d669d2df4819508d75c4da8d573b742cbd2f99baf478951163df58e6899ae" gracePeriod=2 Mar 14 09:33:54 crc kubenswrapper[4956]: I0314 09:33:54.091274 4956 generic.go:334] "Generic (PLEG): container finished" podID="0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb" containerID="078d669d2df4819508d75c4da8d573b742cbd2f99baf478951163df58e6899ae" exitCode=0 Mar 14 09:33:54 crc kubenswrapper[4956]: I0314 09:33:54.091380 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lhqgs" event={"ID":"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb","Type":"ContainerDied","Data":"078d669d2df4819508d75c4da8d573b742cbd2f99baf478951163df58e6899ae"} Mar 14 09:33:54 crc kubenswrapper[4956]: I0314 09:33:54.091649 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lhqgs" event={"ID":"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb","Type":"ContainerDied","Data":"f558aa868513f53568dac28d18a58432a78e6d60babde5751fb8b2805480a3ef"} Mar 14 09:33:54 crc kubenswrapper[4956]: I0314 09:33:54.091669 4956 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="f558aa868513f53568dac28d18a58432a78e6d60babde5751fb8b2805480a3ef" Mar 14 09:33:54 crc kubenswrapper[4956]: I0314 09:33:54.095178 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lhqgs" Mar 14 09:33:54 crc kubenswrapper[4956]: I0314 09:33:54.182304 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-utilities\") pod \"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb\" (UID: \"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb\") " Mar 14 09:33:54 crc kubenswrapper[4956]: I0314 09:33:54.182401 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-catalog-content\") pod \"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb\" (UID: \"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb\") " Mar 14 09:33:54 crc kubenswrapper[4956]: I0314 09:33:54.182423 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcnn2\" (UniqueName: \"kubernetes.io/projected/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-kube-api-access-vcnn2\") pod \"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb\" (UID: \"0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb\") " Mar 14 09:33:54 crc kubenswrapper[4956]: I0314 09:33:54.183594 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-utilities" (OuterVolumeSpecName: "utilities") pod "0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb" (UID: "0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:33:54 crc kubenswrapper[4956]: I0314 09:33:54.189006 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-kube-api-access-vcnn2" (OuterVolumeSpecName: "kube-api-access-vcnn2") pod "0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb" (UID: "0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb"). InnerVolumeSpecName "kube-api-access-vcnn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:33:54 crc kubenswrapper[4956]: I0314 09:33:54.206022 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb" (UID: "0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:33:54 crc kubenswrapper[4956]: I0314 09:33:54.284873 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:33:54 crc kubenswrapper[4956]: I0314 09:33:54.284940 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:33:54 crc kubenswrapper[4956]: I0314 09:33:54.284966 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcnn2\" (UniqueName: \"kubernetes.io/projected/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb-kube-api-access-vcnn2\") on node \"crc\" DevicePath \"\"" Mar 14 09:33:55 crc kubenswrapper[4956]: I0314 09:33:55.103581 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lhqgs" Mar 14 09:33:55 crc kubenswrapper[4956]: I0314 09:33:55.147913 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lhqgs"] Mar 14 09:33:55 crc kubenswrapper[4956]: I0314 09:33:55.165427 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lhqgs"] Mar 14 09:33:55 crc kubenswrapper[4956]: I0314 09:33:55.223260 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb" path="/var/lib/kubelet/pods/0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb/volumes" Mar 14 09:33:55 crc kubenswrapper[4956]: I0314 09:33:55.423671 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:33:55 crc kubenswrapper[4956]: I0314 09:33:55.424180 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:34:00 crc kubenswrapper[4956]: I0314 09:34:00.151589 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558014-bkvpt"] Mar 14 09:34:00 crc kubenswrapper[4956]: E0314 09:34:00.153426 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb" containerName="registry-server" Mar 14 09:34:00 crc kubenswrapper[4956]: I0314 09:34:00.153562 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb" containerName="registry-server" Mar 14 
09:34:00 crc kubenswrapper[4956]: E0314 09:34:00.153648 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb" containerName="extract-content" Mar 14 09:34:00 crc kubenswrapper[4956]: I0314 09:34:00.153724 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb" containerName="extract-content" Mar 14 09:34:00 crc kubenswrapper[4956]: E0314 09:34:00.153811 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb" containerName="extract-utilities" Mar 14 09:34:00 crc kubenswrapper[4956]: I0314 09:34:00.153880 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb" containerName="extract-utilities" Mar 14 09:34:00 crc kubenswrapper[4956]: I0314 09:34:00.154205 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7bd7bc-66b3-423d-b1a6-e1fa38e5c9cb" containerName="registry-server" Mar 14 09:34:00 crc kubenswrapper[4956]: I0314 09:34:00.154990 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558014-bkvpt" Mar 14 09:34:00 crc kubenswrapper[4956]: I0314 09:34:00.157522 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:34:00 crc kubenswrapper[4956]: I0314 09:34:00.159068 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:34:00 crc kubenswrapper[4956]: I0314 09:34:00.159879 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:34:00 crc kubenswrapper[4956]: I0314 09:34:00.161516 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558014-bkvpt"] Mar 14 09:34:00 crc kubenswrapper[4956]: I0314 09:34:00.178268 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq9n2\" (UniqueName: \"kubernetes.io/projected/8250654a-de53-49bc-91c8-bda36567251c-kube-api-access-lq9n2\") pod \"auto-csr-approver-29558014-bkvpt\" (UID: \"8250654a-de53-49bc-91c8-bda36567251c\") " pod="openshift-infra/auto-csr-approver-29558014-bkvpt" Mar 14 09:34:00 crc kubenswrapper[4956]: I0314 09:34:00.280221 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq9n2\" (UniqueName: \"kubernetes.io/projected/8250654a-de53-49bc-91c8-bda36567251c-kube-api-access-lq9n2\") pod \"auto-csr-approver-29558014-bkvpt\" (UID: \"8250654a-de53-49bc-91c8-bda36567251c\") " pod="openshift-infra/auto-csr-approver-29558014-bkvpt" Mar 14 09:34:00 crc kubenswrapper[4956]: I0314 09:34:00.299232 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq9n2\" (UniqueName: \"kubernetes.io/projected/8250654a-de53-49bc-91c8-bda36567251c-kube-api-access-lq9n2\") pod \"auto-csr-approver-29558014-bkvpt\" (UID: \"8250654a-de53-49bc-91c8-bda36567251c\") " 
pod="openshift-infra/auto-csr-approver-29558014-bkvpt" Mar 14 09:34:00 crc kubenswrapper[4956]: I0314 09:34:00.489010 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558014-bkvpt" Mar 14 09:34:00 crc kubenswrapper[4956]: I0314 09:34:00.917400 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558014-bkvpt"] Mar 14 09:34:01 crc kubenswrapper[4956]: I0314 09:34:01.159948 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558014-bkvpt" event={"ID":"8250654a-de53-49bc-91c8-bda36567251c","Type":"ContainerStarted","Data":"6f55eb7b6a59a2640343fc578ad8baad6745d5e29f32bf37c5ca05c267ddf8bf"} Mar 14 09:34:02 crc kubenswrapper[4956]: I0314 09:34:02.169619 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558014-bkvpt" event={"ID":"8250654a-de53-49bc-91c8-bda36567251c","Type":"ContainerStarted","Data":"853facb268109455932b741d6f99159af45f2ed18a4d785e29d67356b80c9788"} Mar 14 09:34:02 crc kubenswrapper[4956]: I0314 09:34:02.187615 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558014-bkvpt" podStartSLOduration=1.373623926 podStartE2EDuration="2.187598009s" podCreationTimestamp="2026-03-14 09:34:00 +0000 UTC" firstStartedPulling="2026-03-14 09:34:00.924376403 +0000 UTC m=+2246.437068671" lastFinishedPulling="2026-03-14 09:34:01.738350486 +0000 UTC m=+2247.251042754" observedRunningTime="2026-03-14 09:34:02.186919751 +0000 UTC m=+2247.699612019" watchObservedRunningTime="2026-03-14 09:34:02.187598009 +0000 UTC m=+2247.700290267" Mar 14 09:34:03 crc kubenswrapper[4956]: I0314 09:34:03.182254 4956 generic.go:334] "Generic (PLEG): container finished" podID="8250654a-de53-49bc-91c8-bda36567251c" containerID="853facb268109455932b741d6f99159af45f2ed18a4d785e29d67356b80c9788" exitCode=0 Mar 14 09:34:03 crc 
kubenswrapper[4956]: I0314 09:34:03.182409 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558014-bkvpt" event={"ID":"8250654a-de53-49bc-91c8-bda36567251c","Type":"ContainerDied","Data":"853facb268109455932b741d6f99159af45f2ed18a4d785e29d67356b80c9788"} Mar 14 09:34:04 crc kubenswrapper[4956]: I0314 09:34:04.477539 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558014-bkvpt" Mar 14 09:34:04 crc kubenswrapper[4956]: I0314 09:34:04.648330 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq9n2\" (UniqueName: \"kubernetes.io/projected/8250654a-de53-49bc-91c8-bda36567251c-kube-api-access-lq9n2\") pod \"8250654a-de53-49bc-91c8-bda36567251c\" (UID: \"8250654a-de53-49bc-91c8-bda36567251c\") " Mar 14 09:34:04 crc kubenswrapper[4956]: I0314 09:34:04.654288 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8250654a-de53-49bc-91c8-bda36567251c-kube-api-access-lq9n2" (OuterVolumeSpecName: "kube-api-access-lq9n2") pod "8250654a-de53-49bc-91c8-bda36567251c" (UID: "8250654a-de53-49bc-91c8-bda36567251c"). InnerVolumeSpecName "kube-api-access-lq9n2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:34:04 crc kubenswrapper[4956]: I0314 09:34:04.750585 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq9n2\" (UniqueName: \"kubernetes.io/projected/8250654a-de53-49bc-91c8-bda36567251c-kube-api-access-lq9n2\") on node \"crc\" DevicePath \"\"" Mar 14 09:34:05 crc kubenswrapper[4956]: I0314 09:34:05.198809 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558014-bkvpt" event={"ID":"8250654a-de53-49bc-91c8-bda36567251c","Type":"ContainerDied","Data":"6f55eb7b6a59a2640343fc578ad8baad6745d5e29f32bf37c5ca05c267ddf8bf"} Mar 14 09:34:05 crc kubenswrapper[4956]: I0314 09:34:05.198846 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f55eb7b6a59a2640343fc578ad8baad6745d5e29f32bf37c5ca05c267ddf8bf" Mar 14 09:34:05 crc kubenswrapper[4956]: I0314 09:34:05.198902 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558014-bkvpt" Mar 14 09:34:05 crc kubenswrapper[4956]: I0314 09:34:05.260864 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558008-vhbd6"] Mar 14 09:34:05 crc kubenswrapper[4956]: I0314 09:34:05.266996 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558008-vhbd6"] Mar 14 09:34:07 crc kubenswrapper[4956]: I0314 09:34:07.218560 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2944c10a-78bd-49e1-a6a0-99db82d462f5" path="/var/lib/kubelet/pods/2944c10a-78bd-49e1-a6a0-99db82d462f5/volumes" Mar 14 09:34:11 crc kubenswrapper[4956]: I0314 09:34:11.656451 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4ftnd"] Mar 14 09:34:11 crc kubenswrapper[4956]: E0314 09:34:11.657985 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8250654a-de53-49bc-91c8-bda36567251c" containerName="oc" Mar 14 09:34:11 crc kubenswrapper[4956]: I0314 09:34:11.658006 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8250654a-de53-49bc-91c8-bda36567251c" containerName="oc" Mar 14 09:34:11 crc kubenswrapper[4956]: I0314 09:34:11.658454 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8250654a-de53-49bc-91c8-bda36567251c" containerName="oc" Mar 14 09:34:11 crc kubenswrapper[4956]: I0314 09:34:11.661432 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4ftnd" Mar 14 09:34:11 crc kubenswrapper[4956]: I0314 09:34:11.753861 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4ftnd"] Mar 14 09:34:11 crc kubenswrapper[4956]: I0314 09:34:11.770553 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4c20536-7aa0-401c-9f83-194ba4c0a149-utilities\") pod \"community-operators-4ftnd\" (UID: \"e4c20536-7aa0-401c-9f83-194ba4c0a149\") " pod="openshift-marketplace/community-operators-4ftnd" Mar 14 09:34:11 crc kubenswrapper[4956]: I0314 09:34:11.770601 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4c20536-7aa0-401c-9f83-194ba4c0a149-catalog-content\") pod \"community-operators-4ftnd\" (UID: \"e4c20536-7aa0-401c-9f83-194ba4c0a149\") " pod="openshift-marketplace/community-operators-4ftnd" Mar 14 09:34:11 crc kubenswrapper[4956]: I0314 09:34:11.770621 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvmm4\" (UniqueName: \"kubernetes.io/projected/e4c20536-7aa0-401c-9f83-194ba4c0a149-kube-api-access-kvmm4\") pod \"community-operators-4ftnd\" (UID: \"e4c20536-7aa0-401c-9f83-194ba4c0a149\") " 
pod="openshift-marketplace/community-operators-4ftnd" Mar 14 09:34:11 crc kubenswrapper[4956]: I0314 09:34:11.872334 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4c20536-7aa0-401c-9f83-194ba4c0a149-utilities\") pod \"community-operators-4ftnd\" (UID: \"e4c20536-7aa0-401c-9f83-194ba4c0a149\") " pod="openshift-marketplace/community-operators-4ftnd" Mar 14 09:34:11 crc kubenswrapper[4956]: I0314 09:34:11.872393 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4c20536-7aa0-401c-9f83-194ba4c0a149-catalog-content\") pod \"community-operators-4ftnd\" (UID: \"e4c20536-7aa0-401c-9f83-194ba4c0a149\") " pod="openshift-marketplace/community-operators-4ftnd" Mar 14 09:34:11 crc kubenswrapper[4956]: I0314 09:34:11.872426 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvmm4\" (UniqueName: \"kubernetes.io/projected/e4c20536-7aa0-401c-9f83-194ba4c0a149-kube-api-access-kvmm4\") pod \"community-operators-4ftnd\" (UID: \"e4c20536-7aa0-401c-9f83-194ba4c0a149\") " pod="openshift-marketplace/community-operators-4ftnd" Mar 14 09:34:11 crc kubenswrapper[4956]: I0314 09:34:11.873085 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4c20536-7aa0-401c-9f83-194ba4c0a149-catalog-content\") pod \"community-operators-4ftnd\" (UID: \"e4c20536-7aa0-401c-9f83-194ba4c0a149\") " pod="openshift-marketplace/community-operators-4ftnd" Mar 14 09:34:11 crc kubenswrapper[4956]: I0314 09:34:11.873253 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4c20536-7aa0-401c-9f83-194ba4c0a149-utilities\") pod \"community-operators-4ftnd\" (UID: \"e4c20536-7aa0-401c-9f83-194ba4c0a149\") " 
pod="openshift-marketplace/community-operators-4ftnd" Mar 14 09:34:11 crc kubenswrapper[4956]: I0314 09:34:11.890183 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvmm4\" (UniqueName: \"kubernetes.io/projected/e4c20536-7aa0-401c-9f83-194ba4c0a149-kube-api-access-kvmm4\") pod \"community-operators-4ftnd\" (UID: \"e4c20536-7aa0-401c-9f83-194ba4c0a149\") " pod="openshift-marketplace/community-operators-4ftnd" Mar 14 09:34:12 crc kubenswrapper[4956]: I0314 09:34:12.017005 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4ftnd" Mar 14 09:34:12 crc kubenswrapper[4956]: I0314 09:34:12.331235 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4ftnd"] Mar 14 09:34:13 crc kubenswrapper[4956]: I0314 09:34:13.279343 4956 generic.go:334] "Generic (PLEG): container finished" podID="e4c20536-7aa0-401c-9f83-194ba4c0a149" containerID="52b79f9c3399c7fd2c512331270aabf10c7c949e2ef8438d615e52070e8b984c" exitCode=0 Mar 14 09:34:13 crc kubenswrapper[4956]: I0314 09:34:13.279434 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4ftnd" event={"ID":"e4c20536-7aa0-401c-9f83-194ba4c0a149","Type":"ContainerDied","Data":"52b79f9c3399c7fd2c512331270aabf10c7c949e2ef8438d615e52070e8b984c"} Mar 14 09:34:13 crc kubenswrapper[4956]: I0314 09:34:13.279661 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4ftnd" event={"ID":"e4c20536-7aa0-401c-9f83-194ba4c0a149","Type":"ContainerStarted","Data":"dd2be64d333ef2a53679a4ed5747b236f90ecbdfdec4bf2c053160a466aa05ef"} Mar 14 09:34:14 crc kubenswrapper[4956]: I0314 09:34:14.289400 4956 generic.go:334] "Generic (PLEG): container finished" podID="e4c20536-7aa0-401c-9f83-194ba4c0a149" containerID="5a5e63ccb1334a52ad58dec0ae5c2ad71a869cc33bfbd75c01e5790921ab4275" exitCode=0 Mar 14 09:34:14 crc 
kubenswrapper[4956]: I0314 09:34:14.289504 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4ftnd" event={"ID":"e4c20536-7aa0-401c-9f83-194ba4c0a149","Type":"ContainerDied","Data":"5a5e63ccb1334a52ad58dec0ae5c2ad71a869cc33bfbd75c01e5790921ab4275"} Mar 14 09:34:15 crc kubenswrapper[4956]: I0314 09:34:15.312870 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4ftnd" event={"ID":"e4c20536-7aa0-401c-9f83-194ba4c0a149","Type":"ContainerStarted","Data":"04782477c33e2db271ad4d29862550f39ab71cb266f4c3a173332f8dbd366d89"} Mar 14 09:34:15 crc kubenswrapper[4956]: I0314 09:34:15.332958 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4ftnd" podStartSLOduration=2.944173 podStartE2EDuration="4.33294263s" podCreationTimestamp="2026-03-14 09:34:11 +0000 UTC" firstStartedPulling="2026-03-14 09:34:13.282273157 +0000 UTC m=+2258.794965425" lastFinishedPulling="2026-03-14 09:34:14.671042787 +0000 UTC m=+2260.183735055" observedRunningTime="2026-03-14 09:34:15.332753096 +0000 UTC m=+2260.845445374" watchObservedRunningTime="2026-03-14 09:34:15.33294263 +0000 UTC m=+2260.845634898" Mar 14 09:34:22 crc kubenswrapper[4956]: I0314 09:34:22.017423 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4ftnd" Mar 14 09:34:22 crc kubenswrapper[4956]: I0314 09:34:22.018061 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4ftnd" Mar 14 09:34:22 crc kubenswrapper[4956]: I0314 09:34:22.064602 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4ftnd" Mar 14 09:34:22 crc kubenswrapper[4956]: I0314 09:34:22.407186 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-4ftnd" Mar 14 09:34:25 crc kubenswrapper[4956]: I0314 09:34:25.424092 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:34:25 crc kubenswrapper[4956]: I0314 09:34:25.424888 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:34:25 crc kubenswrapper[4956]: I0314 09:34:25.425020 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 09:34:25 crc kubenswrapper[4956]: I0314 09:34:25.426582 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a"} pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:34:25 crc kubenswrapper[4956]: I0314 09:34:25.426730 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" containerID="cri-o://b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" gracePeriod=600 Mar 14 09:34:25 crc kubenswrapper[4956]: E0314 09:34:25.554234 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:34:25 crc kubenswrapper[4956]: I0314 09:34:25.632474 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4ftnd"] Mar 14 09:34:25 crc kubenswrapper[4956]: I0314 09:34:25.633162 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4ftnd" podUID="e4c20536-7aa0-401c-9f83-194ba4c0a149" containerName="registry-server" containerID="cri-o://04782477c33e2db271ad4d29862550f39ab71cb266f4c3a173332f8dbd366d89" gracePeriod=2 Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.135414 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4ftnd" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.301842 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4c20536-7aa0-401c-9f83-194ba4c0a149-catalog-content\") pod \"e4c20536-7aa0-401c-9f83-194ba4c0a149\" (UID: \"e4c20536-7aa0-401c-9f83-194ba4c0a149\") " Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.301959 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4c20536-7aa0-401c-9f83-194ba4c0a149-utilities\") pod \"e4c20536-7aa0-401c-9f83-194ba4c0a149\" (UID: \"e4c20536-7aa0-401c-9f83-194ba4c0a149\") " Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.302123 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvmm4\" (UniqueName: 
\"kubernetes.io/projected/e4c20536-7aa0-401c-9f83-194ba4c0a149-kube-api-access-kvmm4\") pod \"e4c20536-7aa0-401c-9f83-194ba4c0a149\" (UID: \"e4c20536-7aa0-401c-9f83-194ba4c0a149\") " Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.308991 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4c20536-7aa0-401c-9f83-194ba4c0a149-utilities" (OuterVolumeSpecName: "utilities") pod "e4c20536-7aa0-401c-9f83-194ba4c0a149" (UID: "e4c20536-7aa0-401c-9f83-194ba4c0a149"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.312811 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4c20536-7aa0-401c-9f83-194ba4c0a149-kube-api-access-kvmm4" (OuterVolumeSpecName: "kube-api-access-kvmm4") pod "e4c20536-7aa0-401c-9f83-194ba4c0a149" (UID: "e4c20536-7aa0-401c-9f83-194ba4c0a149"). InnerVolumeSpecName "kube-api-access-kvmm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.360206 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4c20536-7aa0-401c-9f83-194ba4c0a149-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4c20536-7aa0-401c-9f83-194ba4c0a149" (UID: "e4c20536-7aa0-401c-9f83-194ba4c0a149"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.402970 4956 generic.go:334] "Generic (PLEG): container finished" podID="e4c20536-7aa0-401c-9f83-194ba4c0a149" containerID="04782477c33e2db271ad4d29862550f39ab71cb266f4c3a173332f8dbd366d89" exitCode=0 Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.403050 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4ftnd" event={"ID":"e4c20536-7aa0-401c-9f83-194ba4c0a149","Type":"ContainerDied","Data":"04782477c33e2db271ad4d29862550f39ab71cb266f4c3a173332f8dbd366d89"} Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.403058 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4ftnd" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.403080 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4ftnd" event={"ID":"e4c20536-7aa0-401c-9f83-194ba4c0a149","Type":"ContainerDied","Data":"dd2be64d333ef2a53679a4ed5747b236f90ecbdfdec4bf2c053160a466aa05ef"} Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.403097 4956 scope.go:117] "RemoveContainer" containerID="04782477c33e2db271ad4d29862550f39ab71cb266f4c3a173332f8dbd366d89" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.403742 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4c20536-7aa0-401c-9f83-194ba4c0a149-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.403779 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4c20536-7aa0-401c-9f83-194ba4c0a149-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.403794 4956 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-kvmm4\" (UniqueName: \"kubernetes.io/projected/e4c20536-7aa0-401c-9f83-194ba4c0a149-kube-api-access-kvmm4\") on node \"crc\" DevicePath \"\"" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.408099 4956 generic.go:334] "Generic (PLEG): container finished" podID="9ba20367-e506-422e-a846-eb1525cb3b94" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" exitCode=0 Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.408145 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerDied","Data":"b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a"} Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.408717 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:34:26 crc kubenswrapper[4956]: E0314 09:34:26.409054 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.433193 4956 scope.go:117] "RemoveContainer" containerID="5a5e63ccb1334a52ad58dec0ae5c2ad71a869cc33bfbd75c01e5790921ab4275" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.444879 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4ftnd"] Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.454312 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4ftnd"] Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 
09:34:26.465067 4956 scope.go:117] "RemoveContainer" containerID="52b79f9c3399c7fd2c512331270aabf10c7c949e2ef8438d615e52070e8b984c" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.485368 4956 scope.go:117] "RemoveContainer" containerID="04782477c33e2db271ad4d29862550f39ab71cb266f4c3a173332f8dbd366d89" Mar 14 09:34:26 crc kubenswrapper[4956]: E0314 09:34:26.486034 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04782477c33e2db271ad4d29862550f39ab71cb266f4c3a173332f8dbd366d89\": container with ID starting with 04782477c33e2db271ad4d29862550f39ab71cb266f4c3a173332f8dbd366d89 not found: ID does not exist" containerID="04782477c33e2db271ad4d29862550f39ab71cb266f4c3a173332f8dbd366d89" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.486089 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04782477c33e2db271ad4d29862550f39ab71cb266f4c3a173332f8dbd366d89"} err="failed to get container status \"04782477c33e2db271ad4d29862550f39ab71cb266f4c3a173332f8dbd366d89\": rpc error: code = NotFound desc = could not find container \"04782477c33e2db271ad4d29862550f39ab71cb266f4c3a173332f8dbd366d89\": container with ID starting with 04782477c33e2db271ad4d29862550f39ab71cb266f4c3a173332f8dbd366d89 not found: ID does not exist" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.486126 4956 scope.go:117] "RemoveContainer" containerID="5a5e63ccb1334a52ad58dec0ae5c2ad71a869cc33bfbd75c01e5790921ab4275" Mar 14 09:34:26 crc kubenswrapper[4956]: E0314 09:34:26.486998 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a5e63ccb1334a52ad58dec0ae5c2ad71a869cc33bfbd75c01e5790921ab4275\": container with ID starting with 5a5e63ccb1334a52ad58dec0ae5c2ad71a869cc33bfbd75c01e5790921ab4275 not found: ID does not exist" 
containerID="5a5e63ccb1334a52ad58dec0ae5c2ad71a869cc33bfbd75c01e5790921ab4275" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.487033 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a5e63ccb1334a52ad58dec0ae5c2ad71a869cc33bfbd75c01e5790921ab4275"} err="failed to get container status \"5a5e63ccb1334a52ad58dec0ae5c2ad71a869cc33bfbd75c01e5790921ab4275\": rpc error: code = NotFound desc = could not find container \"5a5e63ccb1334a52ad58dec0ae5c2ad71a869cc33bfbd75c01e5790921ab4275\": container with ID starting with 5a5e63ccb1334a52ad58dec0ae5c2ad71a869cc33bfbd75c01e5790921ab4275 not found: ID does not exist" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.487061 4956 scope.go:117] "RemoveContainer" containerID="52b79f9c3399c7fd2c512331270aabf10c7c949e2ef8438d615e52070e8b984c" Mar 14 09:34:26 crc kubenswrapper[4956]: E0314 09:34:26.488581 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52b79f9c3399c7fd2c512331270aabf10c7c949e2ef8438d615e52070e8b984c\": container with ID starting with 52b79f9c3399c7fd2c512331270aabf10c7c949e2ef8438d615e52070e8b984c not found: ID does not exist" containerID="52b79f9c3399c7fd2c512331270aabf10c7c949e2ef8438d615e52070e8b984c" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.488631 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b79f9c3399c7fd2c512331270aabf10c7c949e2ef8438d615e52070e8b984c"} err="failed to get container status \"52b79f9c3399c7fd2c512331270aabf10c7c949e2ef8438d615e52070e8b984c\": rpc error: code = NotFound desc = could not find container \"52b79f9c3399c7fd2c512331270aabf10c7c949e2ef8438d615e52070e8b984c\": container with ID starting with 52b79f9c3399c7fd2c512331270aabf10c7c949e2ef8438d615e52070e8b984c not found: ID does not exist" Mar 14 09:34:26 crc kubenswrapper[4956]: I0314 09:34:26.488658 4956 scope.go:117] 
"RemoveContainer" containerID="75c968c7158a86ccc8a0fc41e6a65ef91e9a8b27c10e35460c675ff21752f263" Mar 14 09:34:27 crc kubenswrapper[4956]: I0314 09:34:27.219657 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4c20536-7aa0-401c-9f83-194ba4c0a149" path="/var/lib/kubelet/pods/e4c20536-7aa0-401c-9f83-194ba4c0a149/volumes" Mar 14 09:34:37 crc kubenswrapper[4956]: I0314 09:34:37.210712 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:34:37 crc kubenswrapper[4956]: E0314 09:34:37.211614 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:34:52 crc kubenswrapper[4956]: I0314 09:34:52.209233 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:34:52 crc kubenswrapper[4956]: E0314 09:34:52.210017 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:34:55 crc kubenswrapper[4956]: I0314 09:34:55.350157 4956 scope.go:117] "RemoveContainer" containerID="c0a5cff7e539be9cad9438a4453c71493a7597a9f8bb338ab357327b2a140b30" Mar 14 09:35:04 crc kubenswrapper[4956]: I0314 09:35:03.209315 4956 scope.go:117] "RemoveContainer" 
containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:35:04 crc kubenswrapper[4956]: E0314 09:35:03.209927 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.798238 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.799256 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="d63e9247-2a6e-4255-ac39-a1aa02803da8" containerName="watcher-applier" containerID="cri-o://a4ffbf458bbeaf137fefd07065930db59f04fdcf31235101f3a58eeeb9254328" gracePeriod=30 Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.806416 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher0215-account-delete-b6zvh"] Mar 14 09:35:10 crc kubenswrapper[4956]: E0314 09:35:10.806850 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c20536-7aa0-401c-9f83-194ba4c0a149" containerName="extract-utilities" Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.806872 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c20536-7aa0-401c-9f83-194ba4c0a149" containerName="extract-utilities" Mar 14 09:35:10 crc kubenswrapper[4956]: E0314 09:35:10.806885 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c20536-7aa0-401c-9f83-194ba4c0a149" containerName="registry-server" Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.806893 4956 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e4c20536-7aa0-401c-9f83-194ba4c0a149" containerName="registry-server" Mar 14 09:35:10 crc kubenswrapper[4956]: E0314 09:35:10.806904 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c20536-7aa0-401c-9f83-194ba4c0a149" containerName="extract-content" Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.806915 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c20536-7aa0-401c-9f83-194ba4c0a149" containerName="extract-content" Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.807107 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4c20536-7aa0-401c-9f83-194ba4c0a149" containerName="registry-server" Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.807821 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher0215-account-delete-b6zvh" Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.827560 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.827769 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="6f47c104-0d0a-4883-b925-0472823393b7" containerName="watcher-kuttl-api-log" containerID="cri-o://1be7ddb3925357aa7ba060261946518bd501b1ecea30185210568278e40fce2b" gracePeriod=30 Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.829065 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="6f47c104-0d0a-4883-b925-0472823393b7" containerName="watcher-api" containerID="cri-o://f96200014d53288f19bd2fa5a858db7dbef04d0586cbcf3d0c158a1d586c2acc" gracePeriod=30 Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.847343 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher0215-account-delete-b6zvh"] Mar 14 09:35:10 crc 
kubenswrapper[4956]: I0314 09:35:10.856906 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p659d\" (UniqueName: \"kubernetes.io/projected/5c32b21a-7cb6-4432-8ba9-beb695c5423c-kube-api-access-p659d\") pod \"watcher0215-account-delete-b6zvh\" (UID: \"5c32b21a-7cb6-4432-8ba9-beb695c5423c\") " pod="watcher-kuttl-default/watcher0215-account-delete-b6zvh" Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.857199 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c32b21a-7cb6-4432-8ba9-beb695c5423c-operator-scripts\") pod \"watcher0215-account-delete-b6zvh\" (UID: \"5c32b21a-7cb6-4432-8ba9-beb695c5423c\") " pod="watcher-kuttl-default/watcher0215-account-delete-b6zvh" Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.927704 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.927914 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="14c1f6f8-a23d-43a4-8b33-9c6aa8691ead" containerName="watcher-decision-engine" containerID="cri-o://4f685b8d4a901b33cf3e4f30a9033e489e71693ba0f234d60ca7b4694a11f504" gracePeriod=30 Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.958921 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p659d\" (UniqueName: \"kubernetes.io/projected/5c32b21a-7cb6-4432-8ba9-beb695c5423c-kube-api-access-p659d\") pod \"watcher0215-account-delete-b6zvh\" (UID: \"5c32b21a-7cb6-4432-8ba9-beb695c5423c\") " pod="watcher-kuttl-default/watcher0215-account-delete-b6zvh" Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.958984 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c32b21a-7cb6-4432-8ba9-beb695c5423c-operator-scripts\") pod \"watcher0215-account-delete-b6zvh\" (UID: \"5c32b21a-7cb6-4432-8ba9-beb695c5423c\") " pod="watcher-kuttl-default/watcher0215-account-delete-b6zvh" Mar 14 09:35:10 crc kubenswrapper[4956]: I0314 09:35:10.959804 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c32b21a-7cb6-4432-8ba9-beb695c5423c-operator-scripts\") pod \"watcher0215-account-delete-b6zvh\" (UID: \"5c32b21a-7cb6-4432-8ba9-beb695c5423c\") " pod="watcher-kuttl-default/watcher0215-account-delete-b6zvh" Mar 14 09:35:11 crc kubenswrapper[4956]: I0314 09:35:11.001322 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p659d\" (UniqueName: \"kubernetes.io/projected/5c32b21a-7cb6-4432-8ba9-beb695c5423c-kube-api-access-p659d\") pod \"watcher0215-account-delete-b6zvh\" (UID: \"5c32b21a-7cb6-4432-8ba9-beb695c5423c\") " pod="watcher-kuttl-default/watcher0215-account-delete-b6zvh" Mar 14 09:35:11 crc kubenswrapper[4956]: I0314 09:35:11.129075 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher0215-account-delete-b6zvh" Mar 14 09:35:11 crc kubenswrapper[4956]: I0314 09:35:11.652297 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher0215-account-delete-b6zvh"] Mar 14 09:35:11 crc kubenswrapper[4956]: I0314 09:35:11.791934 4956 generic.go:334] "Generic (PLEG): container finished" podID="6f47c104-0d0a-4883-b925-0472823393b7" containerID="1be7ddb3925357aa7ba060261946518bd501b1ecea30185210568278e40fce2b" exitCode=143 Mar 14 09:35:11 crc kubenswrapper[4956]: I0314 09:35:11.792019 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6f47c104-0d0a-4883-b925-0472823393b7","Type":"ContainerDied","Data":"1be7ddb3925357aa7ba060261946518bd501b1ecea30185210568278e40fce2b"} Mar 14 09:35:11 crc kubenswrapper[4956]: I0314 09:35:11.794500 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher0215-account-delete-b6zvh" event={"ID":"5c32b21a-7cb6-4432-8ba9-beb695c5423c","Type":"ContainerStarted","Data":"f552d53c3e8ded3cceaaf23300e0036c1b55331ff6ad2b60ff95d210a739aff1"} Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.232338 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.382309 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-custom-prometheus-ca\") pod \"6f47c104-0d0a-4883-b925-0472823393b7\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.382369 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f47c104-0d0a-4883-b925-0472823393b7-logs\") pod \"6f47c104-0d0a-4883-b925-0472823393b7\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.382391 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65wl8\" (UniqueName: \"kubernetes.io/projected/6f47c104-0d0a-4883-b925-0472823393b7-kube-api-access-65wl8\") pod \"6f47c104-0d0a-4883-b925-0472823393b7\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.382440 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-config-data\") pod \"6f47c104-0d0a-4883-b925-0472823393b7\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.382509 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-combined-ca-bundle\") pod \"6f47c104-0d0a-4883-b925-0472823393b7\" (UID: \"6f47c104-0d0a-4883-b925-0472823393b7\") " Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.385850 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6f47c104-0d0a-4883-b925-0472823393b7-logs" (OuterVolumeSpecName: "logs") pod "6f47c104-0d0a-4883-b925-0472823393b7" (UID: "6f47c104-0d0a-4883-b925-0472823393b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.391465 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f47c104-0d0a-4883-b925-0472823393b7-kube-api-access-65wl8" (OuterVolumeSpecName: "kube-api-access-65wl8") pod "6f47c104-0d0a-4883-b925-0472823393b7" (UID: "6f47c104-0d0a-4883-b925-0472823393b7"). InnerVolumeSpecName "kube-api-access-65wl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.416423 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f47c104-0d0a-4883-b925-0472823393b7" (UID: "6f47c104-0d0a-4883-b925-0472823393b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.417318 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "6f47c104-0d0a-4883-b925-0472823393b7" (UID: "6f47c104-0d0a-4883-b925-0472823393b7"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.450441 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-config-data" (OuterVolumeSpecName: "config-data") pod "6f47c104-0d0a-4883-b925-0472823393b7" (UID: "6f47c104-0d0a-4883-b925-0472823393b7"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.484662 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.484696 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f47c104-0d0a-4883-b925-0472823393b7-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.484706 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65wl8\" (UniqueName: \"kubernetes.io/projected/6f47c104-0d0a-4883-b925-0472823393b7-kube-api-access-65wl8\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.484716 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.484725 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f47c104-0d0a-4883-b925-0472823393b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.675223 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.788379 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zft4\" (UniqueName: \"kubernetes.io/projected/d63e9247-2a6e-4255-ac39-a1aa02803da8-kube-api-access-5zft4\") pod \"d63e9247-2a6e-4255-ac39-a1aa02803da8\" (UID: \"d63e9247-2a6e-4255-ac39-a1aa02803da8\") " Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.788472 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d63e9247-2a6e-4255-ac39-a1aa02803da8-logs\") pod \"d63e9247-2a6e-4255-ac39-a1aa02803da8\" (UID: \"d63e9247-2a6e-4255-ac39-a1aa02803da8\") " Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.788531 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63e9247-2a6e-4255-ac39-a1aa02803da8-config-data\") pod \"d63e9247-2a6e-4255-ac39-a1aa02803da8\" (UID: \"d63e9247-2a6e-4255-ac39-a1aa02803da8\") " Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.788673 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63e9247-2a6e-4255-ac39-a1aa02803da8-combined-ca-bundle\") pod \"d63e9247-2a6e-4255-ac39-a1aa02803da8\" (UID: \"d63e9247-2a6e-4255-ac39-a1aa02803da8\") " Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.789302 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d63e9247-2a6e-4255-ac39-a1aa02803da8-logs" (OuterVolumeSpecName: "logs") pod "d63e9247-2a6e-4255-ac39-a1aa02803da8" (UID: "d63e9247-2a6e-4255-ac39-a1aa02803da8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.793030 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d63e9247-2a6e-4255-ac39-a1aa02803da8-kube-api-access-5zft4" (OuterVolumeSpecName: "kube-api-access-5zft4") pod "d63e9247-2a6e-4255-ac39-a1aa02803da8" (UID: "d63e9247-2a6e-4255-ac39-a1aa02803da8"). InnerVolumeSpecName "kube-api-access-5zft4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.815342 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63e9247-2a6e-4255-ac39-a1aa02803da8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d63e9247-2a6e-4255-ac39-a1aa02803da8" (UID: "d63e9247-2a6e-4255-ac39-a1aa02803da8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.815541 4956 generic.go:334] "Generic (PLEG): container finished" podID="6f47c104-0d0a-4883-b925-0472823393b7" containerID="f96200014d53288f19bd2fa5a858db7dbef04d0586cbcf3d0c158a1d586c2acc" exitCode=0 Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.815567 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6f47c104-0d0a-4883-b925-0472823393b7","Type":"ContainerDied","Data":"f96200014d53288f19bd2fa5a858db7dbef04d0586cbcf3d0c158a1d586c2acc"} Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.815770 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6f47c104-0d0a-4883-b925-0472823393b7","Type":"ContainerDied","Data":"e63fcdf110089e373f93a5c295c7873bf894bc0259a7d6f56c6ba66346649d89"} Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.815619 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.815788 4956 scope.go:117] "RemoveContainer" containerID="f96200014d53288f19bd2fa5a858db7dbef04d0586cbcf3d0c158a1d586c2acc" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.818069 4956 generic.go:334] "Generic (PLEG): container finished" podID="5c32b21a-7cb6-4432-8ba9-beb695c5423c" containerID="1fa6ef6d1e7d6f3331899d16439ca90badd0f43a14f358294827f8d568e81a60" exitCode=0 Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.818134 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher0215-account-delete-b6zvh" event={"ID":"5c32b21a-7cb6-4432-8ba9-beb695c5423c","Type":"ContainerDied","Data":"1fa6ef6d1e7d6f3331899d16439ca90badd0f43a14f358294827f8d568e81a60"} Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.820013 4956 generic.go:334] "Generic (PLEG): container finished" podID="d63e9247-2a6e-4255-ac39-a1aa02803da8" containerID="a4ffbf458bbeaf137fefd07065930db59f04fdcf31235101f3a58eeeb9254328" exitCode=0 Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.820056 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"d63e9247-2a6e-4255-ac39-a1aa02803da8","Type":"ContainerDied","Data":"a4ffbf458bbeaf137fefd07065930db59f04fdcf31235101f3a58eeeb9254328"} Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.820071 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.820081 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"d63e9247-2a6e-4255-ac39-a1aa02803da8","Type":"ContainerDied","Data":"94d68e89d45e187cbd66c47867521c1f6819ce444d75692811b3fdaef3b6f581"} Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.829563 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63e9247-2a6e-4255-ac39-a1aa02803da8-config-data" (OuterVolumeSpecName: "config-data") pod "d63e9247-2a6e-4255-ac39-a1aa02803da8" (UID: "d63e9247-2a6e-4255-ac39-a1aa02803da8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.871656 4956 scope.go:117] "RemoveContainer" containerID="1be7ddb3925357aa7ba060261946518bd501b1ecea30185210568278e40fce2b" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.886293 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.888346 4956 scope.go:117] "RemoveContainer" containerID="f96200014d53288f19bd2fa5a858db7dbef04d0586cbcf3d0c158a1d586c2acc" Mar 14 09:35:12 crc kubenswrapper[4956]: E0314 09:35:12.888786 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f96200014d53288f19bd2fa5a858db7dbef04d0586cbcf3d0c158a1d586c2acc\": container with ID starting with f96200014d53288f19bd2fa5a858db7dbef04d0586cbcf3d0c158a1d586c2acc not found: ID does not exist" containerID="f96200014d53288f19bd2fa5a858db7dbef04d0586cbcf3d0c158a1d586c2acc" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.888817 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f96200014d53288f19bd2fa5a858db7dbef04d0586cbcf3d0c158a1d586c2acc"} err="failed to get container status \"f96200014d53288f19bd2fa5a858db7dbef04d0586cbcf3d0c158a1d586c2acc\": rpc error: code = NotFound desc = could not find container \"f96200014d53288f19bd2fa5a858db7dbef04d0586cbcf3d0c158a1d586c2acc\": container with ID starting with f96200014d53288f19bd2fa5a858db7dbef04d0586cbcf3d0c158a1d586c2acc not found: ID does not exist" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.888837 4956 scope.go:117] "RemoveContainer" containerID="1be7ddb3925357aa7ba060261946518bd501b1ecea30185210568278e40fce2b" Mar 14 09:35:12 crc kubenswrapper[4956]: E0314 09:35:12.889738 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1be7ddb3925357aa7ba060261946518bd501b1ecea30185210568278e40fce2b\": container with ID starting with 1be7ddb3925357aa7ba060261946518bd501b1ecea30185210568278e40fce2b not found: ID does not exist" containerID="1be7ddb3925357aa7ba060261946518bd501b1ecea30185210568278e40fce2b" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.889751 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63e9247-2a6e-4255-ac39-a1aa02803da8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.889762 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1be7ddb3925357aa7ba060261946518bd501b1ecea30185210568278e40fce2b"} err="failed to get container status \"1be7ddb3925357aa7ba060261946518bd501b1ecea30185210568278e40fce2b\": rpc error: code = NotFound desc = could not find container \"1be7ddb3925357aa7ba060261946518bd501b1ecea30185210568278e40fce2b\": container with ID starting with 1be7ddb3925357aa7ba060261946518bd501b1ecea30185210568278e40fce2b not found: ID does not exist" Mar 14 09:35:12 crc 
kubenswrapper[4956]: I0314 09:35:12.889776 4956 scope.go:117] "RemoveContainer" containerID="a4ffbf458bbeaf137fefd07065930db59f04fdcf31235101f3a58eeeb9254328" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.889766 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zft4\" (UniqueName: \"kubernetes.io/projected/d63e9247-2a6e-4255-ac39-a1aa02803da8-kube-api-access-5zft4\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.889838 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d63e9247-2a6e-4255-ac39-a1aa02803da8-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.889852 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63e9247-2a6e-4255-ac39-a1aa02803da8-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.893687 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.907456 4956 scope.go:117] "RemoveContainer" containerID="a4ffbf458bbeaf137fefd07065930db59f04fdcf31235101f3a58eeeb9254328" Mar 14 09:35:12 crc kubenswrapper[4956]: E0314 09:35:12.908632 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4ffbf458bbeaf137fefd07065930db59f04fdcf31235101f3a58eeeb9254328\": container with ID starting with a4ffbf458bbeaf137fefd07065930db59f04fdcf31235101f3a58eeeb9254328 not found: ID does not exist" containerID="a4ffbf458bbeaf137fefd07065930db59f04fdcf31235101f3a58eeeb9254328" Mar 14 09:35:12 crc kubenswrapper[4956]: I0314 09:35:12.908674 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ffbf458bbeaf137fefd07065930db59f04fdcf31235101f3a58eeeb9254328"} 
err="failed to get container status \"a4ffbf458bbeaf137fefd07065930db59f04fdcf31235101f3a58eeeb9254328\": rpc error: code = NotFound desc = could not find container \"a4ffbf458bbeaf137fefd07065930db59f04fdcf31235101f3a58eeeb9254328\": container with ID starting with a4ffbf458bbeaf137fefd07065930db59f04fdcf31235101f3a58eeeb9254328 not found: ID does not exist" Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.150422 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.157180 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.218124 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f47c104-0d0a-4883-b925-0472823393b7" path="/var/lib/kubelet/pods/6f47c104-0d0a-4883-b925-0472823393b7/volumes" Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.218747 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d63e9247-2a6e-4255-ac39-a1aa02803da8" path="/var/lib/kubelet/pods/d63e9247-2a6e-4255-ac39-a1aa02803da8/volumes" Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.431114 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.431559 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" containerName="ceilometer-central-agent" containerID="cri-o://a0f8aea5278cb520a8596eae7eae982cd216c038cc00ecc5ddcd3e35fa2a3c86" gracePeriod=30 Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.431622 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" containerName="sg-core" 
containerID="cri-o://09aeb4bfbce8f0d48ce429c9cc00a5331e7d4a59df31b198b658c3f4eaa4cd69" gracePeriod=30 Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.431613 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" containerName="proxy-httpd" containerID="cri-o://82b1c08c4310908e157d7b46968791121fb4a4f7244aba0bfe85914f96aeeed9" gracePeriod=30 Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.431735 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" containerName="ceilometer-notification-agent" containerID="cri-o://aba7acd55c2c4b35fb09bbd85eaf66602b8293d09926544e38c03ab64a66c9e7" gracePeriod=30 Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.748853 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.829269 4956 generic.go:334] "Generic (PLEG): container finished" podID="14c1f6f8-a23d-43a4-8b33-9c6aa8691ead" containerID="4f685b8d4a901b33cf3e4f30a9033e489e71693ba0f234d60ca7b4694a11f504" exitCode=0 Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.829394 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.830031 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead","Type":"ContainerDied","Data":"4f685b8d4a901b33cf3e4f30a9033e489e71693ba0f234d60ca7b4694a11f504"} Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.830058 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead","Type":"ContainerDied","Data":"46fbf52f8dcc9256e007fb248878ef41a20140a1796f435286a5ac06f24064d8"} Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.830074 4956 scope.go:117] "RemoveContainer" containerID="4f685b8d4a901b33cf3e4f30a9033e489e71693ba0f234d60ca7b4694a11f504" Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.840227 4956 generic.go:334] "Generic (PLEG): container finished" podID="216f8c56-27d3-42da-9559-d58862c4f84a" containerID="82b1c08c4310908e157d7b46968791121fb4a4f7244aba0bfe85914f96aeeed9" exitCode=0 Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.840271 4956 generic.go:334] "Generic (PLEG): container finished" podID="216f8c56-27d3-42da-9559-d58862c4f84a" containerID="09aeb4bfbce8f0d48ce429c9cc00a5331e7d4a59df31b198b658c3f4eaa4cd69" exitCode=2 Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.840305 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"216f8c56-27d3-42da-9559-d58862c4f84a","Type":"ContainerDied","Data":"82b1c08c4310908e157d7b46968791121fb4a4f7244aba0bfe85914f96aeeed9"} Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.840358 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"216f8c56-27d3-42da-9559-d58862c4f84a","Type":"ContainerDied","Data":"09aeb4bfbce8f0d48ce429c9cc00a5331e7d4a59df31b198b658c3f4eaa4cd69"} Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.857714 4956 scope.go:117] "RemoveContainer" containerID="4f685b8d4a901b33cf3e4f30a9033e489e71693ba0f234d60ca7b4694a11f504" Mar 14 09:35:13 crc kubenswrapper[4956]: E0314 09:35:13.858805 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f685b8d4a901b33cf3e4f30a9033e489e71693ba0f234d60ca7b4694a11f504\": container with ID starting with 4f685b8d4a901b33cf3e4f30a9033e489e71693ba0f234d60ca7b4694a11f504 not found: ID does not exist" containerID="4f685b8d4a901b33cf3e4f30a9033e489e71693ba0f234d60ca7b4694a11f504" Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.858848 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f685b8d4a901b33cf3e4f30a9033e489e71693ba0f234d60ca7b4694a11f504"} err="failed to get container status \"4f685b8d4a901b33cf3e4f30a9033e489e71693ba0f234d60ca7b4694a11f504\": rpc error: code = NotFound desc = could not find container \"4f685b8d4a901b33cf3e4f30a9033e489e71693ba0f234d60ca7b4694a11f504\": container with ID starting with 4f685b8d4a901b33cf3e4f30a9033e489e71693ba0f234d60ca7b4694a11f504 not found: ID does not exist" Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.905120 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-custom-prometheus-ca\") pod \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.905186 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-config-data\") 
pod \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.905257 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmxs2\" (UniqueName: \"kubernetes.io/projected/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-kube-api-access-kmxs2\") pod \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.905357 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-combined-ca-bundle\") pod \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.905416 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-logs\") pod \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\" (UID: \"14c1f6f8-a23d-43a4-8b33-9c6aa8691ead\") " Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.906254 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-logs" (OuterVolumeSpecName: "logs") pod "14c1f6f8-a23d-43a4-8b33-9c6aa8691ead" (UID: "14c1f6f8-a23d-43a4-8b33-9c6aa8691ead"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.915105 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-kube-api-access-kmxs2" (OuterVolumeSpecName: "kube-api-access-kmxs2") pod "14c1f6f8-a23d-43a4-8b33-9c6aa8691ead" (UID: "14c1f6f8-a23d-43a4-8b33-9c6aa8691ead"). InnerVolumeSpecName "kube-api-access-kmxs2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.932652 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "14c1f6f8-a23d-43a4-8b33-9c6aa8691ead" (UID: "14c1f6f8-a23d-43a4-8b33-9c6aa8691ead"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.941139 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14c1f6f8-a23d-43a4-8b33-9c6aa8691ead" (UID: "14c1f6f8-a23d-43a4-8b33-9c6aa8691ead"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:13 crc kubenswrapper[4956]: I0314 09:35:13.954851 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-config-data" (OuterVolumeSpecName: "config-data") pod "14c1f6f8-a23d-43a4-8b33-9c6aa8691ead" (UID: "14c1f6f8-a23d-43a4-8b33-9c6aa8691ead"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.007060 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmxs2\" (UniqueName: \"kubernetes.io/projected/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-kube-api-access-kmxs2\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.007094 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.007106 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.007119 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.007128 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.143366 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher0215-account-delete-b6zvh" Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.178069 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.188837 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.310988 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c32b21a-7cb6-4432-8ba9-beb695c5423c-operator-scripts\") pod \"5c32b21a-7cb6-4432-8ba9-beb695c5423c\" (UID: \"5c32b21a-7cb6-4432-8ba9-beb695c5423c\") " Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.311131 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p659d\" (UniqueName: \"kubernetes.io/projected/5c32b21a-7cb6-4432-8ba9-beb695c5423c-kube-api-access-p659d\") pod \"5c32b21a-7cb6-4432-8ba9-beb695c5423c\" (UID: \"5c32b21a-7cb6-4432-8ba9-beb695c5423c\") " Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.311767 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c32b21a-7cb6-4432-8ba9-beb695c5423c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c32b21a-7cb6-4432-8ba9-beb695c5423c" (UID: "5c32b21a-7cb6-4432-8ba9-beb695c5423c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.312678 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c32b21a-7cb6-4432-8ba9-beb695c5423c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.316780 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c32b21a-7cb6-4432-8ba9-beb695c5423c-kube-api-access-p659d" (OuterVolumeSpecName: "kube-api-access-p659d") pod "5c32b21a-7cb6-4432-8ba9-beb695c5423c" (UID: "5c32b21a-7cb6-4432-8ba9-beb695c5423c"). InnerVolumeSpecName "kube-api-access-p659d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.414119 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p659d\" (UniqueName: \"kubernetes.io/projected/5c32b21a-7cb6-4432-8ba9-beb695c5423c-kube-api-access-p659d\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.849052 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher0215-account-delete-b6zvh" event={"ID":"5c32b21a-7cb6-4432-8ba9-beb695c5423c","Type":"ContainerDied","Data":"f552d53c3e8ded3cceaaf23300e0036c1b55331ff6ad2b60ff95d210a739aff1"} Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.849381 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f552d53c3e8ded3cceaaf23300e0036c1b55331ff6ad2b60ff95d210a739aff1" Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.849069 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher0215-account-delete-b6zvh" Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.857090 4956 generic.go:334] "Generic (PLEG): container finished" podID="216f8c56-27d3-42da-9559-d58862c4f84a" containerID="a0f8aea5278cb520a8596eae7eae982cd216c038cc00ecc5ddcd3e35fa2a3c86" exitCode=0 Mar 14 09:35:14 crc kubenswrapper[4956]: I0314 09:35:14.857141 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"216f8c56-27d3-42da-9559-d58862c4f84a","Type":"ContainerDied","Data":"a0f8aea5278cb520a8596eae7eae982cd216c038cc00ecc5ddcd3e35fa2a3c86"} Mar 14 09:35:15 crc kubenswrapper[4956]: I0314 09:35:15.219270 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14c1f6f8-a23d-43a4-8b33-9c6aa8691ead" path="/var/lib/kubelet/pods/14c1f6f8-a23d-43a4-8b33-9c6aa8691ead/volumes" Mar 14 09:35:15 crc kubenswrapper[4956]: I0314 09:35:15.862533 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher0215-account-delete-b6zvh"] Mar 14 09:35:15 crc kubenswrapper[4956]: I0314 09:35:15.870766 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher0215-account-delete-b6zvh"] Mar 14 09:35:15 crc kubenswrapper[4956]: I0314 09:35:15.936201 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-kt2vf"] Mar 14 09:35:15 crc kubenswrapper[4956]: E0314 09:35:15.936810 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f47c104-0d0a-4883-b925-0472823393b7" containerName="watcher-kuttl-api-log" Mar 14 09:35:15 crc kubenswrapper[4956]: I0314 09:35:15.936901 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f47c104-0d0a-4883-b925-0472823393b7" containerName="watcher-kuttl-api-log" Mar 14 09:35:15 crc kubenswrapper[4956]: E0314 09:35:15.936990 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d63e9247-2a6e-4255-ac39-a1aa02803da8" containerName="watcher-applier" Mar 14 09:35:15 crc kubenswrapper[4956]: I0314 09:35:15.937077 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d63e9247-2a6e-4255-ac39-a1aa02803da8" containerName="watcher-applier" Mar 14 09:35:15 crc kubenswrapper[4956]: E0314 09:35:15.937184 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c32b21a-7cb6-4432-8ba9-beb695c5423c" containerName="mariadb-account-delete" Mar 14 09:35:15 crc kubenswrapper[4956]: I0314 09:35:15.937255 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c32b21a-7cb6-4432-8ba9-beb695c5423c" containerName="mariadb-account-delete" Mar 14 09:35:15 crc kubenswrapper[4956]: E0314 09:35:15.937329 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14c1f6f8-a23d-43a4-8b33-9c6aa8691ead" containerName="watcher-decision-engine" Mar 14 09:35:15 crc kubenswrapper[4956]: I0314 09:35:15.937387 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="14c1f6f8-a23d-43a4-8b33-9c6aa8691ead" containerName="watcher-decision-engine" Mar 14 09:35:15 crc kubenswrapper[4956]: E0314 09:35:15.937452 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f47c104-0d0a-4883-b925-0472823393b7" containerName="watcher-api" Mar 14 09:35:15 crc kubenswrapper[4956]: I0314 09:35:15.937522 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f47c104-0d0a-4883-b925-0472823393b7" containerName="watcher-api" Mar 14 09:35:15 crc kubenswrapper[4956]: I0314 09:35:15.937715 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f47c104-0d0a-4883-b925-0472823393b7" containerName="watcher-kuttl-api-log" Mar 14 09:35:15 crc kubenswrapper[4956]: I0314 09:35:15.937801 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c32b21a-7cb6-4432-8ba9-beb695c5423c" containerName="mariadb-account-delete" Mar 14 09:35:15 crc kubenswrapper[4956]: I0314 09:35:15.937885 4956 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6f47c104-0d0a-4883-b925-0472823393b7" containerName="watcher-api" Mar 14 09:35:15 crc kubenswrapper[4956]: I0314 09:35:15.937966 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="14c1f6f8-a23d-43a4-8b33-9c6aa8691ead" containerName="watcher-decision-engine" Mar 14 09:35:15 crc kubenswrapper[4956]: I0314 09:35:15.938047 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d63e9247-2a6e-4255-ac39-a1aa02803da8" containerName="watcher-applier" Mar 14 09:35:15 crc kubenswrapper[4956]: I0314 09:35:15.938762 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-kt2vf" Mar 14 09:35:15 crc kubenswrapper[4956]: I0314 09:35:15.952164 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-kt2vf"] Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.062614 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-3add-account-create-update-mfq8n"] Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.064121 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-3add-account-create-update-mfq8n" Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.066445 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.072613 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-3add-account-create-update-mfq8n"] Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.138245 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e365158-1936-463b-bfaa-a8efd23f8591-operator-scripts\") pod \"watcher-db-create-kt2vf\" (UID: \"7e365158-1936-463b-bfaa-a8efd23f8591\") " pod="watcher-kuttl-default/watcher-db-create-kt2vf" Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.138384 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ghjh\" (UniqueName: \"kubernetes.io/projected/7e365158-1936-463b-bfaa-a8efd23f8591-kube-api-access-7ghjh\") pod \"watcher-db-create-kt2vf\" (UID: \"7e365158-1936-463b-bfaa-a8efd23f8591\") " pod="watcher-kuttl-default/watcher-db-create-kt2vf" Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.240026 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef22fc53-e58a-45b9-a255-8969c18c0667-operator-scripts\") pod \"watcher-3add-account-create-update-mfq8n\" (UID: \"ef22fc53-e58a-45b9-a255-8969c18c0667\") " pod="watcher-kuttl-default/watcher-3add-account-create-update-mfq8n" Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.240108 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7e365158-1936-463b-bfaa-a8efd23f8591-operator-scripts\") pod \"watcher-db-create-kt2vf\" (UID: \"7e365158-1936-463b-bfaa-a8efd23f8591\") " pod="watcher-kuttl-default/watcher-db-create-kt2vf" Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.240215 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ghjh\" (UniqueName: \"kubernetes.io/projected/7e365158-1936-463b-bfaa-a8efd23f8591-kube-api-access-7ghjh\") pod \"watcher-db-create-kt2vf\" (UID: \"7e365158-1936-463b-bfaa-a8efd23f8591\") " pod="watcher-kuttl-default/watcher-db-create-kt2vf" Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.240243 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6vq7\" (UniqueName: \"kubernetes.io/projected/ef22fc53-e58a-45b9-a255-8969c18c0667-kube-api-access-h6vq7\") pod \"watcher-3add-account-create-update-mfq8n\" (UID: \"ef22fc53-e58a-45b9-a255-8969c18c0667\") " pod="watcher-kuttl-default/watcher-3add-account-create-update-mfq8n" Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.240825 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e365158-1936-463b-bfaa-a8efd23f8591-operator-scripts\") pod \"watcher-db-create-kt2vf\" (UID: \"7e365158-1936-463b-bfaa-a8efd23f8591\") " pod="watcher-kuttl-default/watcher-db-create-kt2vf" Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.258680 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ghjh\" (UniqueName: \"kubernetes.io/projected/7e365158-1936-463b-bfaa-a8efd23f8591-kube-api-access-7ghjh\") pod \"watcher-db-create-kt2vf\" (UID: \"7e365158-1936-463b-bfaa-a8efd23f8591\") " pod="watcher-kuttl-default/watcher-db-create-kt2vf" Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.341248 4956 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef22fc53-e58a-45b9-a255-8969c18c0667-operator-scripts\") pod \"watcher-3add-account-create-update-mfq8n\" (UID: \"ef22fc53-e58a-45b9-a255-8969c18c0667\") " pod="watcher-kuttl-default/watcher-3add-account-create-update-mfq8n" Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.341472 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6vq7\" (UniqueName: \"kubernetes.io/projected/ef22fc53-e58a-45b9-a255-8969c18c0667-kube-api-access-h6vq7\") pod \"watcher-3add-account-create-update-mfq8n\" (UID: \"ef22fc53-e58a-45b9-a255-8969c18c0667\") " pod="watcher-kuttl-default/watcher-3add-account-create-update-mfq8n" Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.342865 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef22fc53-e58a-45b9-a255-8969c18c0667-operator-scripts\") pod \"watcher-3add-account-create-update-mfq8n\" (UID: \"ef22fc53-e58a-45b9-a255-8969c18c0667\") " pod="watcher-kuttl-default/watcher-3add-account-create-update-mfq8n" Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.362045 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6vq7\" (UniqueName: \"kubernetes.io/projected/ef22fc53-e58a-45b9-a255-8969c18c0667-kube-api-access-h6vq7\") pod \"watcher-3add-account-create-update-mfq8n\" (UID: \"ef22fc53-e58a-45b9-a255-8969c18c0667\") " pod="watcher-kuttl-default/watcher-3add-account-create-update-mfq8n" Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.387187 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-3add-account-create-update-mfq8n" Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.554194 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-kt2vf" Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.748774 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.142:3000/\": dial tcp 10.217.0.142:3000: connect: connection refused" Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.881679 4956 generic.go:334] "Generic (PLEG): container finished" podID="216f8c56-27d3-42da-9559-d58862c4f84a" containerID="aba7acd55c2c4b35fb09bbd85eaf66602b8293d09926544e38c03ab64a66c9e7" exitCode=0 Mar 14 09:35:16 crc kubenswrapper[4956]: I0314 09:35:16.881852 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"216f8c56-27d3-42da-9559-d58862c4f84a","Type":"ContainerDied","Data":"aba7acd55c2c4b35fb09bbd85eaf66602b8293d09926544e38c03ab64a66c9e7"} Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.047461 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-3add-account-create-update-mfq8n"] Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.087122 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-kt2vf"] Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.169888 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.220066 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c32b21a-7cb6-4432-8ba9-beb695c5423c" path="/var/lib/kubelet/pods/5c32b21a-7cb6-4432-8ba9-beb695c5423c/volumes" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.267142 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216f8c56-27d3-42da-9559-d58862c4f84a-log-httpd\") pod \"216f8c56-27d3-42da-9559-d58862c4f84a\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.267194 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-scripts\") pod \"216f8c56-27d3-42da-9559-d58862c4f84a\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.267223 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-ceilometer-tls-certs\") pod \"216f8c56-27d3-42da-9559-d58862c4f84a\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.267250 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-combined-ca-bundle\") pod \"216f8c56-27d3-42da-9559-d58862c4f84a\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.267298 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8cq5\" (UniqueName: \"kubernetes.io/projected/216f8c56-27d3-42da-9559-d58862c4f84a-kube-api-access-d8cq5\") pod 
\"216f8c56-27d3-42da-9559-d58862c4f84a\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.267321 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216f8c56-27d3-42da-9559-d58862c4f84a-run-httpd\") pod \"216f8c56-27d3-42da-9559-d58862c4f84a\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.267351 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-sg-core-conf-yaml\") pod \"216f8c56-27d3-42da-9559-d58862c4f84a\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.267398 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-config-data\") pod \"216f8c56-27d3-42da-9559-d58862c4f84a\" (UID: \"216f8c56-27d3-42da-9559-d58862c4f84a\") " Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.274975 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/216f8c56-27d3-42da-9559-d58862c4f84a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "216f8c56-27d3-42da-9559-d58862c4f84a" (UID: "216f8c56-27d3-42da-9559-d58862c4f84a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.278856 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/216f8c56-27d3-42da-9559-d58862c4f84a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "216f8c56-27d3-42da-9559-d58862c4f84a" (UID: "216f8c56-27d3-42da-9559-d58862c4f84a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.279577 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/216f8c56-27d3-42da-9559-d58862c4f84a-kube-api-access-d8cq5" (OuterVolumeSpecName: "kube-api-access-d8cq5") pod "216f8c56-27d3-42da-9559-d58862c4f84a" (UID: "216f8c56-27d3-42da-9559-d58862c4f84a"). InnerVolumeSpecName "kube-api-access-d8cq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.282280 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-scripts" (OuterVolumeSpecName: "scripts") pod "216f8c56-27d3-42da-9559-d58862c4f84a" (UID: "216f8c56-27d3-42da-9559-d58862c4f84a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.325913 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "216f8c56-27d3-42da-9559-d58862c4f84a" (UID: "216f8c56-27d3-42da-9559-d58862c4f84a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.369601 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.369634 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8cq5\" (UniqueName: \"kubernetes.io/projected/216f8c56-27d3-42da-9559-d58862c4f84a-kube-api-access-d8cq5\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.369648 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216f8c56-27d3-42da-9559-d58862c4f84a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.369661 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.369671 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/216f8c56-27d3-42da-9559-d58862c4f84a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.402897 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "216f8c56-27d3-42da-9559-d58862c4f84a" (UID: "216f8c56-27d3-42da-9559-d58862c4f84a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.417636 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "216f8c56-27d3-42da-9559-d58862c4f84a" (UID: "216f8c56-27d3-42da-9559-d58862c4f84a"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.475588 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.475628 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.494701 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-config-data" (OuterVolumeSpecName: "config-data") pod "216f8c56-27d3-42da-9559-d58862c4f84a" (UID: "216f8c56-27d3-42da-9559-d58862c4f84a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.576968 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/216f8c56-27d3-42da-9559-d58862c4f84a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.894627 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"216f8c56-27d3-42da-9559-d58862c4f84a","Type":"ContainerDied","Data":"c890ef2d604b8552f9c50686fc70350b5e5ef4b34ea53f8eda4f516326bdef58"} Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.894708 4956 scope.go:117] "RemoveContainer" containerID="82b1c08c4310908e157d7b46968791121fb4a4f7244aba0bfe85914f96aeeed9" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.894645 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.896007 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-kt2vf" event={"ID":"7e365158-1936-463b-bfaa-a8efd23f8591","Type":"ContainerStarted","Data":"ad78893c6fe976e2b6214171ec481a61342b2acfab97a066bc1395af18228014"} Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.896030 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-kt2vf" event={"ID":"7e365158-1936-463b-bfaa-a8efd23f8591","Type":"ContainerStarted","Data":"865a8720b7ebb626f15114293a661b534eb03e356f27fecf3cf548757f474149"} Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.913473 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3add-account-create-update-mfq8n" event={"ID":"ef22fc53-e58a-45b9-a255-8969c18c0667","Type":"ContainerStarted","Data":"ff9b932a913ce2d3b11e3722e0544935f1ad11da47da4716cd4b0c8233c8b32b"} Mar 14 09:35:17 crc 
kubenswrapper[4956]: I0314 09:35:17.913625 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3add-account-create-update-mfq8n" event={"ID":"ef22fc53-e58a-45b9-a255-8969c18c0667","Type":"ContainerStarted","Data":"58e43a240c4ec8f4a1b37bd7884257a7a0635d9cd7e7d0af6371bd4d41e0e6fe"} Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.920970 4956 scope.go:117] "RemoveContainer" containerID="09aeb4bfbce8f0d48ce429c9cc00a5331e7d4a59df31b198b658c3f4eaa4cd69" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.941921 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-db-create-kt2vf" podStartSLOduration=2.941905155 podStartE2EDuration="2.941905155s" podCreationTimestamp="2026-03-14 09:35:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:35:17.925830506 +0000 UTC m=+2323.438522774" watchObservedRunningTime="2026-03-14 09:35:17.941905155 +0000 UTC m=+2323.454597413" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.947377 4956 scope.go:117] "RemoveContainer" containerID="aba7acd55c2c4b35fb09bbd85eaf66602b8293d09926544e38c03ab64a66c9e7" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.956642 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-3add-account-create-update-mfq8n" podStartSLOduration=1.95662413 podStartE2EDuration="1.95662413s" podCreationTimestamp="2026-03-14 09:35:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:35:17.940045558 +0000 UTC m=+2323.452737826" watchObservedRunningTime="2026-03-14 09:35:17.95662413 +0000 UTC m=+2323.469316398" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.963544 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.969209 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.981133 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:17 crc kubenswrapper[4956]: E0314 09:35:17.981450 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" containerName="sg-core" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.981466 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" containerName="sg-core" Mar 14 09:35:17 crc kubenswrapper[4956]: E0314 09:35:17.981507 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" containerName="ceilometer-notification-agent" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.981514 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" containerName="ceilometer-notification-agent" Mar 14 09:35:17 crc kubenswrapper[4956]: E0314 09:35:17.981526 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" containerName="ceilometer-central-agent" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.981532 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" containerName="ceilometer-central-agent" Mar 14 09:35:17 crc kubenswrapper[4956]: E0314 09:35:17.981546 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" containerName="proxy-httpd" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.981552 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" containerName="proxy-httpd" Mar 14 09:35:17 crc 
kubenswrapper[4956]: I0314 09:35:17.981697 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" containerName="ceilometer-central-agent" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.981708 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" containerName="proxy-httpd" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.981725 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" containerName="ceilometer-notification-agent" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.981733 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" containerName="sg-core" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.983173 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.985813 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.991373 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.991798 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.992106 4956 scope.go:117] "RemoveContainer" containerID="a0f8aea5278cb520a8596eae7eae982cd216c038cc00ecc5ddcd3e35fa2a3c86" Mar 14 09:35:17 crc kubenswrapper[4956]: I0314 09:35:17.993501 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.085040 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.085086 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.085101 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.085130 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f09bee3a-b26b-436f-9462-302a9af35a11-run-httpd\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.085293 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f09bee3a-b26b-436f-9462-302a9af35a11-log-httpd\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.085366 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-scripts\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.085499 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-config-data\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.085627 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kml5r\" (UniqueName: \"kubernetes.io/projected/f09bee3a-b26b-436f-9462-302a9af35a11-kube-api-access-kml5r\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.186878 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.186928 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.186946 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.186972 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f09bee3a-b26b-436f-9462-302a9af35a11-run-httpd\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.186996 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f09bee3a-b26b-436f-9462-302a9af35a11-log-httpd\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.187024 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-scripts\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.187055 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-config-data\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.187081 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kml5r\" (UniqueName: \"kubernetes.io/projected/f09bee3a-b26b-436f-9462-302a9af35a11-kube-api-access-kml5r\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " 
pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.187430 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f09bee3a-b26b-436f-9462-302a9af35a11-run-httpd\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.187567 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f09bee3a-b26b-436f-9462-302a9af35a11-log-httpd\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.191754 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.191994 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-scripts\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.192087 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.192208 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-config-data\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.192792 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.202974 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kml5r\" (UniqueName: \"kubernetes.io/projected/f09bee3a-b26b-436f-9462-302a9af35a11-kube-api-access-kml5r\") pod \"ceilometer-0\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.209413 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:35:18 crc kubenswrapper[4956]: E0314 09:35:18.209724 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.301022 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.721573 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:18 crc kubenswrapper[4956]: W0314 09:35:18.722175 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf09bee3a_b26b_436f_9462_302a9af35a11.slice/crio-8f7db46a3778a7f5eb1c414cf29736835eb2222ea34bcbcf916b6600e8b3847a WatchSource:0}: Error finding container 8f7db46a3778a7f5eb1c414cf29736835eb2222ea34bcbcf916b6600e8b3847a: Status 404 returned error can't find the container with id 8f7db46a3778a7f5eb1c414cf29736835eb2222ea34bcbcf916b6600e8b3847a Mar 14 09:35:18 crc kubenswrapper[4956]: I0314 09:35:18.920757 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f09bee3a-b26b-436f-9462-302a9af35a11","Type":"ContainerStarted","Data":"8f7db46a3778a7f5eb1c414cf29736835eb2222ea34bcbcf916b6600e8b3847a"} Mar 14 09:35:19 crc kubenswrapper[4956]: I0314 09:35:19.224528 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="216f8c56-27d3-42da-9559-d58862c4f84a" path="/var/lib/kubelet/pods/216f8c56-27d3-42da-9559-d58862c4f84a/volumes" Mar 14 09:35:19 crc kubenswrapper[4956]: I0314 09:35:19.930441 4956 generic.go:334] "Generic (PLEG): container finished" podID="ef22fc53-e58a-45b9-a255-8969c18c0667" containerID="ff9b932a913ce2d3b11e3722e0544935f1ad11da47da4716cd4b0c8233c8b32b" exitCode=0 Mar 14 09:35:19 crc kubenswrapper[4956]: I0314 09:35:19.930527 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3add-account-create-update-mfq8n" event={"ID":"ef22fc53-e58a-45b9-a255-8969c18c0667","Type":"ContainerDied","Data":"ff9b932a913ce2d3b11e3722e0544935f1ad11da47da4716cd4b0c8233c8b32b"} Mar 14 09:35:19 crc kubenswrapper[4956]: I0314 09:35:19.934555 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f09bee3a-b26b-436f-9462-302a9af35a11","Type":"ContainerStarted","Data":"0c4ee583b6d05094a6751829c08a8a2cf5d93efaafb447cf19b061c2cdb2653a"} Mar 14 09:35:19 crc kubenswrapper[4956]: I0314 09:35:19.935928 4956 generic.go:334] "Generic (PLEG): container finished" podID="7e365158-1936-463b-bfaa-a8efd23f8591" containerID="ad78893c6fe976e2b6214171ec481a61342b2acfab97a066bc1395af18228014" exitCode=0 Mar 14 09:35:19 crc kubenswrapper[4956]: I0314 09:35:19.935963 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-kt2vf" event={"ID":"7e365158-1936-463b-bfaa-a8efd23f8591","Type":"ContainerDied","Data":"ad78893c6fe976e2b6214171ec481a61342b2acfab97a066bc1395af18228014"} Mar 14 09:35:20 crc kubenswrapper[4956]: I0314 09:35:20.945695 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f09bee3a-b26b-436f-9462-302a9af35a11","Type":"ContainerStarted","Data":"81d82fead6ec471c537d49007676a43aa06e211d36e5d38d2cc9a2849c327dd8"} Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.351293 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-3add-account-create-update-mfq8n" Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.354794 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-kt2vf" Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.438153 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef22fc53-e58a-45b9-a255-8969c18c0667-operator-scripts\") pod \"ef22fc53-e58a-45b9-a255-8969c18c0667\" (UID: \"ef22fc53-e58a-45b9-a255-8969c18c0667\") " Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.438227 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6vq7\" (UniqueName: \"kubernetes.io/projected/ef22fc53-e58a-45b9-a255-8969c18c0667-kube-api-access-h6vq7\") pod \"ef22fc53-e58a-45b9-a255-8969c18c0667\" (UID: \"ef22fc53-e58a-45b9-a255-8969c18c0667\") " Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.438294 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ghjh\" (UniqueName: \"kubernetes.io/projected/7e365158-1936-463b-bfaa-a8efd23f8591-kube-api-access-7ghjh\") pod \"7e365158-1936-463b-bfaa-a8efd23f8591\" (UID: \"7e365158-1936-463b-bfaa-a8efd23f8591\") " Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.438314 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e365158-1936-463b-bfaa-a8efd23f8591-operator-scripts\") pod \"7e365158-1936-463b-bfaa-a8efd23f8591\" (UID: \"7e365158-1936-463b-bfaa-a8efd23f8591\") " Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.439222 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e365158-1936-463b-bfaa-a8efd23f8591-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e365158-1936-463b-bfaa-a8efd23f8591" (UID: "7e365158-1936-463b-bfaa-a8efd23f8591"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.439291 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef22fc53-e58a-45b9-a255-8969c18c0667-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ef22fc53-e58a-45b9-a255-8969c18c0667" (UID: "ef22fc53-e58a-45b9-a255-8969c18c0667"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.441901 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e365158-1936-463b-bfaa-a8efd23f8591-kube-api-access-7ghjh" (OuterVolumeSpecName: "kube-api-access-7ghjh") pod "7e365158-1936-463b-bfaa-a8efd23f8591" (UID: "7e365158-1936-463b-bfaa-a8efd23f8591"). InnerVolumeSpecName "kube-api-access-7ghjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.442204 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef22fc53-e58a-45b9-a255-8969c18c0667-kube-api-access-h6vq7" (OuterVolumeSpecName: "kube-api-access-h6vq7") pod "ef22fc53-e58a-45b9-a255-8969c18c0667" (UID: "ef22fc53-e58a-45b9-a255-8969c18c0667"). InnerVolumeSpecName "kube-api-access-h6vq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.539667 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef22fc53-e58a-45b9-a255-8969c18c0667-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.539959 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6vq7\" (UniqueName: \"kubernetes.io/projected/ef22fc53-e58a-45b9-a255-8969c18c0667-kube-api-access-h6vq7\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.540046 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ghjh\" (UniqueName: \"kubernetes.io/projected/7e365158-1936-463b-bfaa-a8efd23f8591-kube-api-access-7ghjh\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.540133 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e365158-1936-463b-bfaa-a8efd23f8591-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.958760 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f09bee3a-b26b-436f-9462-302a9af35a11","Type":"ContainerStarted","Data":"884318b582236e114968a416b87068a9d76a5033d3033e3d6df9e339b6671e28"} Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.962103 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-kt2vf" event={"ID":"7e365158-1936-463b-bfaa-a8efd23f8591","Type":"ContainerDied","Data":"865a8720b7ebb626f15114293a661b534eb03e356f27fecf3cf548757f474149"} Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.962137 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-kt2vf" Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.962150 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="865a8720b7ebb626f15114293a661b534eb03e356f27fecf3cf548757f474149" Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.964089 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3add-account-create-update-mfq8n" event={"ID":"ef22fc53-e58a-45b9-a255-8969c18c0667","Type":"ContainerDied","Data":"58e43a240c4ec8f4a1b37bd7884257a7a0635d9cd7e7d0af6371bd4d41e0e6fe"} Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.964118 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58e43a240c4ec8f4a1b37bd7884257a7a0635d9cd7e7d0af6371bd4d41e0e6fe" Mar 14 09:35:21 crc kubenswrapper[4956]: I0314 09:35:21.964160 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-3add-account-create-update-mfq8n" Mar 14 09:35:23 crc kubenswrapper[4956]: I0314 09:35:23.980535 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f09bee3a-b26b-436f-9462-302a9af35a11","Type":"ContainerStarted","Data":"8e6cb2d13ec645463b67293e6295aef8bf24e29ce9860b25d9c00e20a66670b0"} Mar 14 09:35:23 crc kubenswrapper[4956]: I0314 09:35:23.981966 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:24 crc kubenswrapper[4956]: I0314 09:35:24.003876 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.589587433 podStartE2EDuration="7.003862084s" podCreationTimestamp="2026-03-14 09:35:17 +0000 UTC" firstStartedPulling="2026-03-14 09:35:18.724885499 +0000 UTC m=+2324.237577767" lastFinishedPulling="2026-03-14 09:35:23.13916016 +0000 UTC 
m=+2328.651852418" observedRunningTime="2026-03-14 09:35:24.001813092 +0000 UTC m=+2329.514505360" watchObservedRunningTime="2026-03-14 09:35:24.003862084 +0000 UTC m=+2329.516554352" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.494366 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx"] Mar 14 09:35:26 crc kubenswrapper[4956]: E0314 09:35:26.495102 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef22fc53-e58a-45b9-a255-8969c18c0667" containerName="mariadb-account-create-update" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.495121 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef22fc53-e58a-45b9-a255-8969c18c0667" containerName="mariadb-account-create-update" Mar 14 09:35:26 crc kubenswrapper[4956]: E0314 09:35:26.495135 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e365158-1936-463b-bfaa-a8efd23f8591" containerName="mariadb-database-create" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.495142 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e365158-1936-463b-bfaa-a8efd23f8591" containerName="mariadb-database-create" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.495341 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef22fc53-e58a-45b9-a255-8969c18c0667" containerName="mariadb-account-create-update" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.495366 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e365158-1936-463b-bfaa-a8efd23f8591" containerName="mariadb-database-create" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.496267 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.498584 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.498844 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-n8k9w" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.507291 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx"] Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.514385 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-db-sync-config-data\") pod \"watcher-kuttl-db-sync-hs2sx\" (UID: \"fc244e02-efe3-498b-807f-f0e2b00e934f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.514495 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzkkj\" (UniqueName: \"kubernetes.io/projected/fc244e02-efe3-498b-807f-f0e2b00e934f-kube-api-access-kzkkj\") pod \"watcher-kuttl-db-sync-hs2sx\" (UID: \"fc244e02-efe3-498b-807f-f0e2b00e934f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.514536 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-hs2sx\" (UID: \"fc244e02-efe3-498b-807f-f0e2b00e934f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.514578 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-config-data\") pod \"watcher-kuttl-db-sync-hs2sx\" (UID: \"fc244e02-efe3-498b-807f-f0e2b00e934f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.616281 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-hs2sx\" (UID: \"fc244e02-efe3-498b-807f-f0e2b00e934f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.616343 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-config-data\") pod \"watcher-kuttl-db-sync-hs2sx\" (UID: \"fc244e02-efe3-498b-807f-f0e2b00e934f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.616384 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-db-sync-config-data\") pod \"watcher-kuttl-db-sync-hs2sx\" (UID: \"fc244e02-efe3-498b-807f-f0e2b00e934f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.616455 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzkkj\" (UniqueName: \"kubernetes.io/projected/fc244e02-efe3-498b-807f-f0e2b00e934f-kube-api-access-kzkkj\") pod \"watcher-kuttl-db-sync-hs2sx\" (UID: \"fc244e02-efe3-498b-807f-f0e2b00e934f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" Mar 14 09:35:26 crc kubenswrapper[4956]: 
I0314 09:35:26.622545 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-config-data\") pod \"watcher-kuttl-db-sync-hs2sx\" (UID: \"fc244e02-efe3-498b-807f-f0e2b00e934f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.623074 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-hs2sx\" (UID: \"fc244e02-efe3-498b-807f-f0e2b00e934f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.628805 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-db-sync-config-data\") pod \"watcher-kuttl-db-sync-hs2sx\" (UID: \"fc244e02-efe3-498b-807f-f0e2b00e934f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.644033 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzkkj\" (UniqueName: \"kubernetes.io/projected/fc244e02-efe3-498b-807f-f0e2b00e934f-kube-api-access-kzkkj\") pod \"watcher-kuttl-db-sync-hs2sx\" (UID: \"fc244e02-efe3-498b-807f-f0e2b00e934f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" Mar 14 09:35:26 crc kubenswrapper[4956]: I0314 09:35:26.814505 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" Mar 14 09:35:27 crc kubenswrapper[4956]: I0314 09:35:27.318339 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx"] Mar 14 09:35:28 crc kubenswrapper[4956]: I0314 09:35:28.019846 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" event={"ID":"fc244e02-efe3-498b-807f-f0e2b00e934f","Type":"ContainerStarted","Data":"0cb602bcda56debdcb4f862f2c5a27238fa03d5f9a69c722685b1a2ecf47fe87"} Mar 14 09:35:28 crc kubenswrapper[4956]: I0314 09:35:28.020174 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" event={"ID":"fc244e02-efe3-498b-807f-f0e2b00e934f","Type":"ContainerStarted","Data":"53853604174c34e570226cca3953e6f4e0c27e71b903f9209881c415305a3841"} Mar 14 09:35:28 crc kubenswrapper[4956]: I0314 09:35:28.035796 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" podStartSLOduration=2.035775265 podStartE2EDuration="2.035775265s" podCreationTimestamp="2026-03-14 09:35:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:35:28.03441252 +0000 UTC m=+2333.547104788" watchObservedRunningTime="2026-03-14 09:35:28.035775265 +0000 UTC m=+2333.548467533" Mar 14 09:35:30 crc kubenswrapper[4956]: I0314 09:35:30.037693 4956 generic.go:334] "Generic (PLEG): container finished" podID="fc244e02-efe3-498b-807f-f0e2b00e934f" containerID="0cb602bcda56debdcb4f862f2c5a27238fa03d5f9a69c722685b1a2ecf47fe87" exitCode=0 Mar 14 09:35:30 crc kubenswrapper[4956]: I0314 09:35:30.037755 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" 
event={"ID":"fc244e02-efe3-498b-807f-f0e2b00e934f","Type":"ContainerDied","Data":"0cb602bcda56debdcb4f862f2c5a27238fa03d5f9a69c722685b1a2ecf47fe87"} Mar 14 09:35:31 crc kubenswrapper[4956]: I0314 09:35:31.357046 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" Mar 14 09:35:31 crc kubenswrapper[4956]: I0314 09:35:31.392620 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-db-sync-config-data\") pod \"fc244e02-efe3-498b-807f-f0e2b00e934f\" (UID: \"fc244e02-efe3-498b-807f-f0e2b00e934f\") " Mar 14 09:35:31 crc kubenswrapper[4956]: I0314 09:35:31.392709 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzkkj\" (UniqueName: \"kubernetes.io/projected/fc244e02-efe3-498b-807f-f0e2b00e934f-kube-api-access-kzkkj\") pod \"fc244e02-efe3-498b-807f-f0e2b00e934f\" (UID: \"fc244e02-efe3-498b-807f-f0e2b00e934f\") " Mar 14 09:35:31 crc kubenswrapper[4956]: I0314 09:35:31.392755 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-combined-ca-bundle\") pod \"fc244e02-efe3-498b-807f-f0e2b00e934f\" (UID: \"fc244e02-efe3-498b-807f-f0e2b00e934f\") " Mar 14 09:35:31 crc kubenswrapper[4956]: I0314 09:35:31.392807 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-config-data\") pod \"fc244e02-efe3-498b-807f-f0e2b00e934f\" (UID: \"fc244e02-efe3-498b-807f-f0e2b00e934f\") " Mar 14 09:35:31 crc kubenswrapper[4956]: I0314 09:35:31.398162 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fc244e02-efe3-498b-807f-f0e2b00e934f" (UID: "fc244e02-efe3-498b-807f-f0e2b00e934f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:31 crc kubenswrapper[4956]: I0314 09:35:31.398459 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc244e02-efe3-498b-807f-f0e2b00e934f-kube-api-access-kzkkj" (OuterVolumeSpecName: "kube-api-access-kzkkj") pod "fc244e02-efe3-498b-807f-f0e2b00e934f" (UID: "fc244e02-efe3-498b-807f-f0e2b00e934f"). InnerVolumeSpecName "kube-api-access-kzkkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:35:31 crc kubenswrapper[4956]: I0314 09:35:31.420336 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc244e02-efe3-498b-807f-f0e2b00e934f" (UID: "fc244e02-efe3-498b-807f-f0e2b00e934f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:31 crc kubenswrapper[4956]: I0314 09:35:31.436415 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-config-data" (OuterVolumeSpecName: "config-data") pod "fc244e02-efe3-498b-807f-f0e2b00e934f" (UID: "fc244e02-efe3-498b-807f-f0e2b00e934f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:31 crc kubenswrapper[4956]: I0314 09:35:31.495224 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:31 crc kubenswrapper[4956]: I0314 09:35:31.495256 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:31 crc kubenswrapper[4956]: I0314 09:35:31.495267 4956 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fc244e02-efe3-498b-807f-f0e2b00e934f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:31 crc kubenswrapper[4956]: I0314 09:35:31.495279 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzkkj\" (UniqueName: \"kubernetes.io/projected/fc244e02-efe3-498b-807f-f0e2b00e934f-kube-api-access-kzkkj\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.052590 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" event={"ID":"fc244e02-efe3-498b-807f-f0e2b00e934f","Type":"ContainerDied","Data":"53853604174c34e570226cca3953e6f4e0c27e71b903f9209881c415305a3841"} Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.052856 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53853604174c34e570226cca3953e6f4e0c27e71b903f9209881c415305a3841" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.052655 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.209460 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:35:32 crc kubenswrapper[4956]: E0314 09:35:32.209971 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.264367 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:35:32 crc kubenswrapper[4956]: E0314 09:35:32.264707 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc244e02-efe3-498b-807f-f0e2b00e934f" containerName="watcher-kuttl-db-sync" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.264726 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc244e02-efe3-498b-807f-f0e2b00e934f" containerName="watcher-kuttl-db-sync" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.264887 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc244e02-efe3-498b-807f-f0e2b00e934f" containerName="watcher-kuttl-db-sync" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.265796 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.267453 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-n8k9w" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.268765 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.276632 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.308521 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.308610 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.308874 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ddtn\" (UniqueName: \"kubernetes.io/projected/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-kube-api-access-9ddtn\") pod \"watcher-kuttl-api-0\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.309112 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-logs\") pod \"watcher-kuttl-api-0\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.309152 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.326293 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.327340 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.330495 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.336615 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.374881 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.376805 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.380438 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.386758 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.410498 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.410547 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ddtn\" (UniqueName: \"kubernetes.io/projected/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-kube-api-access-9ddtn\") pod \"watcher-kuttl-api-0\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.410656 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g548h\" (UniqueName: \"kubernetes.io/projected/8fe2a92d-ff95-4ead-b54e-790a69b16631-kube-api-access-g548h\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.410684 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: 
\"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.410712 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.410748 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-logs\") pod \"watcher-kuttl-api-0\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.410773 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8prh\" (UniqueName: \"kubernetes.io/projected/7efd3e75-902d-40cc-bdfb-4158145f7c49-kube-api-access-j8prh\") pod \"watcher-kuttl-applier-0\" (UID: \"7efd3e75-902d-40cc-bdfb-4158145f7c49\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.410791 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.410810 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7efd3e75-902d-40cc-bdfb-4158145f7c49-logs\") pod \"watcher-kuttl-applier-0\" (UID: 
\"7efd3e75-902d-40cc-bdfb-4158145f7c49\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.410838 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fe2a92d-ff95-4ead-b54e-790a69b16631-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.410857 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efd3e75-902d-40cc-bdfb-4158145f7c49-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"7efd3e75-902d-40cc-bdfb-4158145f7c49\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.410874 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efd3e75-902d-40cc-bdfb-4158145f7c49-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"7efd3e75-902d-40cc-bdfb-4158145f7c49\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.410893 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.410913 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.411845 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-logs\") pod \"watcher-kuttl-api-0\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.416120 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.419688 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.429007 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.430092 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ddtn\" (UniqueName: \"kubernetes.io/projected/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-kube-api-access-9ddtn\") pod \"watcher-kuttl-api-0\" (UID: 
\"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.512537 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g548h\" (UniqueName: \"kubernetes.io/projected/8fe2a92d-ff95-4ead-b54e-790a69b16631-kube-api-access-g548h\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.512776 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.512878 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.512981 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8prh\" (UniqueName: \"kubernetes.io/projected/7efd3e75-902d-40cc-bdfb-4158145f7c49-kube-api-access-j8prh\") pod \"watcher-kuttl-applier-0\" (UID: \"7efd3e75-902d-40cc-bdfb-4158145f7c49\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.513060 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7efd3e75-902d-40cc-bdfb-4158145f7c49-logs\") pod \"watcher-kuttl-applier-0\" 
(UID: \"7efd3e75-902d-40cc-bdfb-4158145f7c49\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.513165 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fe2a92d-ff95-4ead-b54e-790a69b16631-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.513271 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efd3e75-902d-40cc-bdfb-4158145f7c49-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"7efd3e75-902d-40cc-bdfb-4158145f7c49\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.513366 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efd3e75-902d-40cc-bdfb-4158145f7c49-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"7efd3e75-902d-40cc-bdfb-4158145f7c49\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.513877 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.513725 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fe2a92d-ff95-4ead-b54e-790a69b16631-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") 
" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.513585 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7efd3e75-902d-40cc-bdfb-4158145f7c49-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"7efd3e75-902d-40cc-bdfb-4158145f7c49\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.516500 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.516699 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.516817 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efd3e75-902d-40cc-bdfb-4158145f7c49-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"7efd3e75-902d-40cc-bdfb-4158145f7c49\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.518545 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.519604 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efd3e75-902d-40cc-bdfb-4158145f7c49-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"7efd3e75-902d-40cc-bdfb-4158145f7c49\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.528311 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8prh\" (UniqueName: \"kubernetes.io/projected/7efd3e75-902d-40cc-bdfb-4158145f7c49-kube-api-access-j8prh\") pod \"watcher-kuttl-applier-0\" (UID: \"7efd3e75-902d-40cc-bdfb-4158145f7c49\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.530193 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g548h\" (UniqueName: \"kubernetes.io/projected/8fe2a92d-ff95-4ead-b54e-790a69b16631-kube-api-access-g548h\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.582794 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.644017 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:32 crc kubenswrapper[4956]: I0314 09:35:32.703694 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:33 crc kubenswrapper[4956]: I0314 09:35:33.004105 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:35:33 crc kubenswrapper[4956]: I0314 09:35:33.060668 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39","Type":"ContainerStarted","Data":"e61415638911ee61761e1755d70626d5fb45d6245eb47f5b0902a4b880c8f639"} Mar 14 09:35:33 crc kubenswrapper[4956]: I0314 09:35:33.108431 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:35:33 crc kubenswrapper[4956]: W0314 09:35:33.110465 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7efd3e75_902d_40cc_bdfb_4158145f7c49.slice/crio-0feeaa3a7016e195e46259b66f27b524d236418e8bd4cd2f877f228fba4e104b WatchSource:0}: Error finding container 0feeaa3a7016e195e46259b66f27b524d236418e8bd4cd2f877f228fba4e104b: Status 404 returned error can't find the container with id 0feeaa3a7016e195e46259b66f27b524d236418e8bd4cd2f877f228fba4e104b Mar 14 09:35:33 crc kubenswrapper[4956]: I0314 09:35:33.182799 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:35:34 crc kubenswrapper[4956]: I0314 09:35:34.070522 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39","Type":"ContainerStarted","Data":"40677c771a6eef27f42eb8a3b34805ad9728de6d1b1242a9719d34fec2af1356"} Mar 14 09:35:34 crc kubenswrapper[4956]: I0314 09:35:34.071649 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39","Type":"ContainerStarted","Data":"2fa3155851bea2048a9f4dbde96c47c167dad16379dc5e6c7456ff264906555b"} Mar 14 09:35:34 crc kubenswrapper[4956]: I0314 09:35:34.071730 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:34 crc kubenswrapper[4956]: I0314 09:35:34.072420 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"7efd3e75-902d-40cc-bdfb-4158145f7c49","Type":"ContainerStarted","Data":"4a49256f2f57a6f737c625a9b5947ead43d7e3c63b90657cc855d31c6b7b9e8b"} Mar 14 09:35:34 crc kubenswrapper[4956]: I0314 09:35:34.072540 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"7efd3e75-902d-40cc-bdfb-4158145f7c49","Type":"ContainerStarted","Data":"0feeaa3a7016e195e46259b66f27b524d236418e8bd4cd2f877f228fba4e104b"} Mar 14 09:35:34 crc kubenswrapper[4956]: I0314 09:35:34.073596 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"8fe2a92d-ff95-4ead-b54e-790a69b16631","Type":"ContainerStarted","Data":"487a19ff26075691c7ac5a93f380fb7c32bfa027fa19d18a6cc7c653dacbde69"} Mar 14 09:35:34 crc kubenswrapper[4956]: I0314 09:35:34.073631 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"8fe2a92d-ff95-4ead-b54e-790a69b16631","Type":"ContainerStarted","Data":"18beac058aef1cc94d9a4c5227d795104d5d9007f362517cd2e6804c454f90ea"} Mar 14 09:35:34 crc kubenswrapper[4956]: I0314 09:35:34.097157 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.097135177 podStartE2EDuration="2.097135177s" podCreationTimestamp="2026-03-14 09:35:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:35:34.092495829 +0000 UTC m=+2339.605188097" watchObservedRunningTime="2026-03-14 09:35:34.097135177 +0000 UTC m=+2339.609827445" Mar 14 09:35:34 crc kubenswrapper[4956]: I0314 09:35:34.115691 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.115670019 podStartE2EDuration="2.115670019s" podCreationTimestamp="2026-03-14 09:35:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:35:34.108932917 +0000 UTC m=+2339.621625195" watchObservedRunningTime="2026-03-14 09:35:34.115670019 +0000 UTC m=+2339.628362287" Mar 14 09:35:34 crc kubenswrapper[4956]: I0314 09:35:34.130332 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.130311921 podStartE2EDuration="2.130311921s" podCreationTimestamp="2026-03-14 09:35:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:35:34.12593135 +0000 UTC m=+2339.638623618" watchObservedRunningTime="2026-03-14 09:35:34.130311921 +0000 UTC m=+2339.643004189" Mar 14 09:35:36 crc kubenswrapper[4956]: I0314 09:35:36.401671 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:37 crc kubenswrapper[4956]: I0314 09:35:37.583067 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:37 crc kubenswrapper[4956]: I0314 09:35:37.645116 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:42 crc kubenswrapper[4956]: I0314 09:35:42.583157 4956 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:42 crc kubenswrapper[4956]: I0314 09:35:42.589108 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:42 crc kubenswrapper[4956]: I0314 09:35:42.644440 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:42 crc kubenswrapper[4956]: I0314 09:35:42.666220 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:42 crc kubenswrapper[4956]: I0314 09:35:42.704343 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:42 crc kubenswrapper[4956]: I0314 09:35:42.724848 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:43 crc kubenswrapper[4956]: I0314 09:35:43.261313 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:43 crc kubenswrapper[4956]: I0314 09:35:43.266585 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:43 crc kubenswrapper[4956]: I0314 09:35:43.298135 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:43 crc kubenswrapper[4956]: I0314 09:35:43.298588 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:44 crc kubenswrapper[4956]: I0314 09:35:44.387085 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:44 crc kubenswrapper[4956]: I0314 09:35:44.387676 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" containerName="ceilometer-central-agent" containerID="cri-o://0c4ee583b6d05094a6751829c08a8a2cf5d93efaafb447cf19b061c2cdb2653a" gracePeriod=30 Mar 14 09:35:44 crc kubenswrapper[4956]: I0314 09:35:44.387733 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" containerName="sg-core" containerID="cri-o://884318b582236e114968a416b87068a9d76a5033d3033e3d6df9e339b6671e28" gracePeriod=30 Mar 14 09:35:44 crc kubenswrapper[4956]: I0314 09:35:44.387789 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" containerName="ceilometer-notification-agent" containerID="cri-o://81d82fead6ec471c537d49007676a43aa06e211d36e5d38d2cc9a2849c327dd8" gracePeriod=30 Mar 14 09:35:44 crc kubenswrapper[4956]: I0314 09:35:44.387841 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" containerName="proxy-httpd" containerID="cri-o://8e6cb2d13ec645463b67293e6295aef8bf24e29ce9860b25d9c00e20a66670b0" gracePeriod=30 Mar 14 09:35:44 crc kubenswrapper[4956]: I0314 09:35:44.425611 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.283050 4956 generic.go:334] "Generic (PLEG): container finished" podID="f09bee3a-b26b-436f-9462-302a9af35a11" 
containerID="8e6cb2d13ec645463b67293e6295aef8bf24e29ce9860b25d9c00e20a66670b0" exitCode=0 Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.283084 4956 generic.go:334] "Generic (PLEG): container finished" podID="f09bee3a-b26b-436f-9462-302a9af35a11" containerID="884318b582236e114968a416b87068a9d76a5033d3033e3d6df9e339b6671e28" exitCode=2 Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.283092 4956 generic.go:334] "Generic (PLEG): container finished" podID="f09bee3a-b26b-436f-9462-302a9af35a11" containerID="0c4ee583b6d05094a6751829c08a8a2cf5d93efaafb447cf19b061c2cdb2653a" exitCode=0 Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.284107 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f09bee3a-b26b-436f-9462-302a9af35a11","Type":"ContainerDied","Data":"8e6cb2d13ec645463b67293e6295aef8bf24e29ce9860b25d9c00e20a66670b0"} Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.284148 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f09bee3a-b26b-436f-9462-302a9af35a11","Type":"ContainerDied","Data":"884318b582236e114968a416b87068a9d76a5033d3033e3d6df9e339b6671e28"} Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.284162 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f09bee3a-b26b-436f-9462-302a9af35a11","Type":"ContainerDied","Data":"0c4ee583b6d05094a6751829c08a8a2cf5d93efaafb447cf19b061c2cdb2653a"} Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.538048 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx"] Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.546729 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-hs2sx"] Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.580156 4956 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["watcher-kuttl-default/watcher3add-account-delete-dfmh4"] Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.581123 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher3add-account-delete-dfmh4" Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.596103 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher3add-account-delete-dfmh4"] Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.650404 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.654693 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="7efd3e75-902d-40cc-bdfb-4158145f7c49" containerName="watcher-applier" containerID="cri-o://4a49256f2f57a6f737c625a9b5947ead43d7e3c63b90657cc855d31c6b7b9e8b" gracePeriod=30 Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.687496 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.692609 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="2905e6b3-3aa6-47f5-ad12-dd3bd6748e39" containerName="watcher-api" containerID="cri-o://40677c771a6eef27f42eb8a3b34805ad9728de6d1b1242a9719d34fec2af1356" gracePeriod=30 Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.688024 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="2905e6b3-3aa6-47f5-ad12-dd3bd6748e39" containerName="watcher-kuttl-api-log" containerID="cri-o://2fa3155851bea2048a9f4dbde96c47c167dad16379dc5e6c7456ff264906555b" gracePeriod=30 Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.698968 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8jsr\" (UniqueName: \"kubernetes.io/projected/44081b62-9e6e-40f1-8e17-4a4f42aa8cad-kube-api-access-p8jsr\") pod \"watcher3add-account-delete-dfmh4\" (UID: \"44081b62-9e6e-40f1-8e17-4a4f42aa8cad\") " pod="watcher-kuttl-default/watcher3add-account-delete-dfmh4" Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.699105 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44081b62-9e6e-40f1-8e17-4a4f42aa8cad-operator-scripts\") pod \"watcher3add-account-delete-dfmh4\" (UID: \"44081b62-9e6e-40f1-8e17-4a4f42aa8cad\") " pod="watcher-kuttl-default/watcher3add-account-delete-dfmh4" Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.730569 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.801062 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8jsr\" (UniqueName: \"kubernetes.io/projected/44081b62-9e6e-40f1-8e17-4a4f42aa8cad-kube-api-access-p8jsr\") pod \"watcher3add-account-delete-dfmh4\" (UID: \"44081b62-9e6e-40f1-8e17-4a4f42aa8cad\") " pod="watcher-kuttl-default/watcher3add-account-delete-dfmh4" Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.801200 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44081b62-9e6e-40f1-8e17-4a4f42aa8cad-operator-scripts\") pod \"watcher3add-account-delete-dfmh4\" (UID: \"44081b62-9e6e-40f1-8e17-4a4f42aa8cad\") " pod="watcher-kuttl-default/watcher3add-account-delete-dfmh4" Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.802055 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/44081b62-9e6e-40f1-8e17-4a4f42aa8cad-operator-scripts\") pod \"watcher3add-account-delete-dfmh4\" (UID: \"44081b62-9e6e-40f1-8e17-4a4f42aa8cad\") " pod="watcher-kuttl-default/watcher3add-account-delete-dfmh4" Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.838927 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8jsr\" (UniqueName: \"kubernetes.io/projected/44081b62-9e6e-40f1-8e17-4a4f42aa8cad-kube-api-access-p8jsr\") pod \"watcher3add-account-delete-dfmh4\" (UID: \"44081b62-9e6e-40f1-8e17-4a4f42aa8cad\") " pod="watcher-kuttl-default/watcher3add-account-delete-dfmh4" Mar 14 09:35:45 crc kubenswrapper[4956]: I0314 09:35:45.895354 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher3add-account-delete-dfmh4" Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.209509 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:35:46 crc kubenswrapper[4956]: E0314 09:35:46.210080 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.294778 4956 generic.go:334] "Generic (PLEG): container finished" podID="2905e6b3-3aa6-47f5-ad12-dd3bd6748e39" containerID="2fa3155851bea2048a9f4dbde96c47c167dad16379dc5e6c7456ff264906555b" exitCode=143 Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.294883 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39","Type":"ContainerDied","Data":"2fa3155851bea2048a9f4dbde96c47c167dad16379dc5e6c7456ff264906555b"} Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.294959 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="8fe2a92d-ff95-4ead-b54e-790a69b16631" containerName="watcher-decision-engine" containerID="cri-o://487a19ff26075691c7ac5a93f380fb7c32bfa027fa19d18a6cc7c653dacbde69" gracePeriod=30 Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.481169 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher3add-account-delete-dfmh4"] Mar 14 09:35:46 crc kubenswrapper[4956]: W0314 09:35:46.490378 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44081b62_9e6e_40f1_8e17_4a4f42aa8cad.slice/crio-7ac8234cd5e2ac4b449ac8bce9db376c2459ec62868a0094207188cb1256f01d WatchSource:0}: Error finding container 7ac8234cd5e2ac4b449ac8bce9db376c2459ec62868a0094207188cb1256f01d: Status 404 returned error can't find the container with id 7ac8234cd5e2ac4b449ac8bce9db376c2459ec62868a0094207188cb1256f01d Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.688474 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.819016 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-config-data\") pod \"f09bee3a-b26b-436f-9462-302a9af35a11\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.819092 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f09bee3a-b26b-436f-9462-302a9af35a11-run-httpd\") pod \"f09bee3a-b26b-436f-9462-302a9af35a11\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.819118 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-scripts\") pod \"f09bee3a-b26b-436f-9462-302a9af35a11\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.819222 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-sg-core-conf-yaml\") pod \"f09bee3a-b26b-436f-9462-302a9af35a11\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.819240 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f09bee3a-b26b-436f-9462-302a9af35a11-log-httpd\") pod \"f09bee3a-b26b-436f-9462-302a9af35a11\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.819334 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-combined-ca-bundle\") pod \"f09bee3a-b26b-436f-9462-302a9af35a11\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.819358 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-ceilometer-tls-certs\") pod \"f09bee3a-b26b-436f-9462-302a9af35a11\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.819404 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kml5r\" (UniqueName: \"kubernetes.io/projected/f09bee3a-b26b-436f-9462-302a9af35a11-kube-api-access-kml5r\") pod \"f09bee3a-b26b-436f-9462-302a9af35a11\" (UID: \"f09bee3a-b26b-436f-9462-302a9af35a11\") " Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.820278 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f09bee3a-b26b-436f-9462-302a9af35a11-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f09bee3a-b26b-436f-9462-302a9af35a11" (UID: "f09bee3a-b26b-436f-9462-302a9af35a11"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.820451 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f09bee3a-b26b-436f-9462-302a9af35a11-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f09bee3a-b26b-436f-9462-302a9af35a11" (UID: "f09bee3a-b26b-436f-9462-302a9af35a11"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.826450 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f09bee3a-b26b-436f-9462-302a9af35a11-kube-api-access-kml5r" (OuterVolumeSpecName: "kube-api-access-kml5r") pod "f09bee3a-b26b-436f-9462-302a9af35a11" (UID: "f09bee3a-b26b-436f-9462-302a9af35a11"). InnerVolumeSpecName "kube-api-access-kml5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.830535 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-scripts" (OuterVolumeSpecName: "scripts") pod "f09bee3a-b26b-436f-9462-302a9af35a11" (UID: "f09bee3a-b26b-436f-9462-302a9af35a11"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.864539 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f09bee3a-b26b-436f-9462-302a9af35a11" (UID: "f09bee3a-b26b-436f-9462-302a9af35a11"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.896358 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f09bee3a-b26b-436f-9462-302a9af35a11" (UID: "f09bee3a-b26b-436f-9462-302a9af35a11"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.905693 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f09bee3a-b26b-436f-9462-302a9af35a11" (UID: "f09bee3a-b26b-436f-9462-302a9af35a11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.921101 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.921140 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f09bee3a-b26b-436f-9462-302a9af35a11-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.921153 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.921165 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.921178 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kml5r\" (UniqueName: \"kubernetes.io/projected/f09bee3a-b26b-436f-9462-302a9af35a11-kube-api-access-kml5r\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.921190 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/f09bee3a-b26b-436f-9462-302a9af35a11-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.921200 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:46 crc kubenswrapper[4956]: I0314 09:35:46.928105 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-config-data" (OuterVolumeSpecName: "config-data") pod "f09bee3a-b26b-436f-9462-302a9af35a11" (UID: "f09bee3a-b26b-436f-9462-302a9af35a11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.022942 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f09bee3a-b26b-436f-9462-302a9af35a11-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.067836 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.219158 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc244e02-efe3-498b-807f-f0e2b00e934f" path="/var/lib/kubelet/pods/fc244e02-efe3-498b-807f-f0e2b00e934f/volumes" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.224684 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-logs\") pod \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.224831 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-combined-ca-bundle\") pod \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.224859 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-custom-prometheus-ca\") pod \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.224974 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ddtn\" (UniqueName: \"kubernetes.io/projected/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-kube-api-access-9ddtn\") pod \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.225017 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-config-data\") 
pod \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\" (UID: \"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39\") " Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.225044 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-logs" (OuterVolumeSpecName: "logs") pod "2905e6b3-3aa6-47f5-ad12-dd3bd6748e39" (UID: "2905e6b3-3aa6-47f5-ad12-dd3bd6748e39"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.225367 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.233747 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-kube-api-access-9ddtn" (OuterVolumeSpecName: "kube-api-access-9ddtn") pod "2905e6b3-3aa6-47f5-ad12-dd3bd6748e39" (UID: "2905e6b3-3aa6-47f5-ad12-dd3bd6748e39"). InnerVolumeSpecName "kube-api-access-9ddtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.263513 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "2905e6b3-3aa6-47f5-ad12-dd3bd6748e39" (UID: "2905e6b3-3aa6-47f5-ad12-dd3bd6748e39"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.273499 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2905e6b3-3aa6-47f5-ad12-dd3bd6748e39" (UID: "2905e6b3-3aa6-47f5-ad12-dd3bd6748e39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.287743 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-config-data" (OuterVolumeSpecName: "config-data") pod "2905e6b3-3aa6-47f5-ad12-dd3bd6748e39" (UID: "2905e6b3-3aa6-47f5-ad12-dd3bd6748e39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.302719 4956 generic.go:334] "Generic (PLEG): container finished" podID="44081b62-9e6e-40f1-8e17-4a4f42aa8cad" containerID="cedd2ef91032e570c2da4cac0c600ac8256b8f8cdaf02cc5f35e4b95cbdda7e9" exitCode=0 Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.302775 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher3add-account-delete-dfmh4" event={"ID":"44081b62-9e6e-40f1-8e17-4a4f42aa8cad","Type":"ContainerDied","Data":"cedd2ef91032e570c2da4cac0c600ac8256b8f8cdaf02cc5f35e4b95cbdda7e9"} Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.302799 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher3add-account-delete-dfmh4" event={"ID":"44081b62-9e6e-40f1-8e17-4a4f42aa8cad","Type":"ContainerStarted","Data":"7ac8234cd5e2ac4b449ac8bce9db376c2459ec62868a0094207188cb1256f01d"} Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.306011 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="f09bee3a-b26b-436f-9462-302a9af35a11" containerID="81d82fead6ec471c537d49007676a43aa06e211d36e5d38d2cc9a2849c327dd8" exitCode=0 Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.306070 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f09bee3a-b26b-436f-9462-302a9af35a11","Type":"ContainerDied","Data":"81d82fead6ec471c537d49007676a43aa06e211d36e5d38d2cc9a2849c327dd8"} Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.306111 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.306137 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f09bee3a-b26b-436f-9462-302a9af35a11","Type":"ContainerDied","Data":"8f7db46a3778a7f5eb1c414cf29736835eb2222ea34bcbcf916b6600e8b3847a"} Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.306155 4956 scope.go:117] "RemoveContainer" containerID="8e6cb2d13ec645463b67293e6295aef8bf24e29ce9860b25d9c00e20a66670b0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.309268 4956 generic.go:334] "Generic (PLEG): container finished" podID="2905e6b3-3aa6-47f5-ad12-dd3bd6748e39" containerID="40677c771a6eef27f42eb8a3b34805ad9728de6d1b1242a9719d34fec2af1356" exitCode=0 Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.309361 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39","Type":"ContainerDied","Data":"40677c771a6eef27f42eb8a3b34805ad9728de6d1b1242a9719d34fec2af1356"} Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.309395 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"2905e6b3-3aa6-47f5-ad12-dd3bd6748e39","Type":"ContainerDied","Data":"e61415638911ee61761e1755d70626d5fb45d6245eb47f5b0902a4b880c8f639"} Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.309444 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.326919 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.327268 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.327284 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ddtn\" (UniqueName: \"kubernetes.io/projected/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-kube-api-access-9ddtn\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.327294 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.336356 4956 scope.go:117] "RemoveContainer" containerID="884318b582236e114968a416b87068a9d76a5033d3033e3d6df9e339b6671e28" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.344510 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.350632 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.368226 4956 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.369308 4956 scope.go:117] "RemoveContainer" containerID="81d82fead6ec471c537d49007676a43aa06e211d36e5d38d2cc9a2849c327dd8" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.382570 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.392962 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:47 crc kubenswrapper[4956]: E0314 09:35:47.393299 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" containerName="ceilometer-notification-agent" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.393314 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" containerName="ceilometer-notification-agent" Mar 14 09:35:47 crc kubenswrapper[4956]: E0314 09:35:47.393328 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2905e6b3-3aa6-47f5-ad12-dd3bd6748e39" containerName="watcher-kuttl-api-log" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.393334 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="2905e6b3-3aa6-47f5-ad12-dd3bd6748e39" containerName="watcher-kuttl-api-log" Mar 14 09:35:47 crc kubenswrapper[4956]: E0314 09:35:47.393347 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" containerName="proxy-httpd" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.393353 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" containerName="proxy-httpd" Mar 14 09:35:47 crc kubenswrapper[4956]: E0314 09:35:47.393364 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" 
containerName="ceilometer-central-agent" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.393370 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" containerName="ceilometer-central-agent" Mar 14 09:35:47 crc kubenswrapper[4956]: E0314 09:35:47.393385 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" containerName="sg-core" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.393391 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" containerName="sg-core" Mar 14 09:35:47 crc kubenswrapper[4956]: E0314 09:35:47.393399 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2905e6b3-3aa6-47f5-ad12-dd3bd6748e39" containerName="watcher-api" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.393404 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="2905e6b3-3aa6-47f5-ad12-dd3bd6748e39" containerName="watcher-api" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.393542 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" containerName="proxy-httpd" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.393556 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" containerName="ceilometer-central-agent" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.393575 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" containerName="ceilometer-notification-agent" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.393621 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="2905e6b3-3aa6-47f5-ad12-dd3bd6748e39" containerName="watcher-kuttl-api-log" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.393632 4956 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2905e6b3-3aa6-47f5-ad12-dd3bd6748e39" containerName="watcher-api" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.393639 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" containerName="sg-core" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.400563 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.404934 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.405745 4956 scope.go:117] "RemoveContainer" containerID="0c4ee583b6d05094a6751829c08a8a2cf5d93efaafb447cf19b061c2cdb2653a" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.406051 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.406228 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.406637 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.429531 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.429925 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxvbf\" (UniqueName: 
\"kubernetes.io/projected/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-kube-api-access-hxvbf\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.430129 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-config-data\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.430301 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.430400 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.430581 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-run-httpd\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.430704 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-log-httpd\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.430805 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-scripts\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.437609 4956 scope.go:117] "RemoveContainer" containerID="8e6cb2d13ec645463b67293e6295aef8bf24e29ce9860b25d9c00e20a66670b0" Mar 14 09:35:47 crc kubenswrapper[4956]: E0314 09:35:47.438117 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e6cb2d13ec645463b67293e6295aef8bf24e29ce9860b25d9c00e20a66670b0\": container with ID starting with 8e6cb2d13ec645463b67293e6295aef8bf24e29ce9860b25d9c00e20a66670b0 not found: ID does not exist" containerID="8e6cb2d13ec645463b67293e6295aef8bf24e29ce9860b25d9c00e20a66670b0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.438209 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e6cb2d13ec645463b67293e6295aef8bf24e29ce9860b25d9c00e20a66670b0"} err="failed to get container status \"8e6cb2d13ec645463b67293e6295aef8bf24e29ce9860b25d9c00e20a66670b0\": rpc error: code = NotFound desc = could not find container \"8e6cb2d13ec645463b67293e6295aef8bf24e29ce9860b25d9c00e20a66670b0\": container with ID starting with 8e6cb2d13ec645463b67293e6295aef8bf24e29ce9860b25d9c00e20a66670b0 not found: ID does not exist" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.438284 4956 scope.go:117] "RemoveContainer" containerID="884318b582236e114968a416b87068a9d76a5033d3033e3d6df9e339b6671e28" Mar 14 09:35:47 crc 
kubenswrapper[4956]: E0314 09:35:47.438610 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"884318b582236e114968a416b87068a9d76a5033d3033e3d6df9e339b6671e28\": container with ID starting with 884318b582236e114968a416b87068a9d76a5033d3033e3d6df9e339b6671e28 not found: ID does not exist" containerID="884318b582236e114968a416b87068a9d76a5033d3033e3d6df9e339b6671e28" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.438690 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"884318b582236e114968a416b87068a9d76a5033d3033e3d6df9e339b6671e28"} err="failed to get container status \"884318b582236e114968a416b87068a9d76a5033d3033e3d6df9e339b6671e28\": rpc error: code = NotFound desc = could not find container \"884318b582236e114968a416b87068a9d76a5033d3033e3d6df9e339b6671e28\": container with ID starting with 884318b582236e114968a416b87068a9d76a5033d3033e3d6df9e339b6671e28 not found: ID does not exist" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.438771 4956 scope.go:117] "RemoveContainer" containerID="81d82fead6ec471c537d49007676a43aa06e211d36e5d38d2cc9a2849c327dd8" Mar 14 09:35:47 crc kubenswrapper[4956]: E0314 09:35:47.439130 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81d82fead6ec471c537d49007676a43aa06e211d36e5d38d2cc9a2849c327dd8\": container with ID starting with 81d82fead6ec471c537d49007676a43aa06e211d36e5d38d2cc9a2849c327dd8 not found: ID does not exist" containerID="81d82fead6ec471c537d49007676a43aa06e211d36e5d38d2cc9a2849c327dd8" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.439172 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d82fead6ec471c537d49007676a43aa06e211d36e5d38d2cc9a2849c327dd8"} err="failed to get container status 
\"81d82fead6ec471c537d49007676a43aa06e211d36e5d38d2cc9a2849c327dd8\": rpc error: code = NotFound desc = could not find container \"81d82fead6ec471c537d49007676a43aa06e211d36e5d38d2cc9a2849c327dd8\": container with ID starting with 81d82fead6ec471c537d49007676a43aa06e211d36e5d38d2cc9a2849c327dd8 not found: ID does not exist" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.439203 4956 scope.go:117] "RemoveContainer" containerID="0c4ee583b6d05094a6751829c08a8a2cf5d93efaafb447cf19b061c2cdb2653a" Mar 14 09:35:47 crc kubenswrapper[4956]: E0314 09:35:47.439584 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c4ee583b6d05094a6751829c08a8a2cf5d93efaafb447cf19b061c2cdb2653a\": container with ID starting with 0c4ee583b6d05094a6751829c08a8a2cf5d93efaafb447cf19b061c2cdb2653a not found: ID does not exist" containerID="0c4ee583b6d05094a6751829c08a8a2cf5d93efaafb447cf19b061c2cdb2653a" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.439606 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c4ee583b6d05094a6751829c08a8a2cf5d93efaafb447cf19b061c2cdb2653a"} err="failed to get container status \"0c4ee583b6d05094a6751829c08a8a2cf5d93efaafb447cf19b061c2cdb2653a\": rpc error: code = NotFound desc = could not find container \"0c4ee583b6d05094a6751829c08a8a2cf5d93efaafb447cf19b061c2cdb2653a\": container with ID starting with 0c4ee583b6d05094a6751829c08a8a2cf5d93efaafb447cf19b061c2cdb2653a not found: ID does not exist" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.439632 4956 scope.go:117] "RemoveContainer" containerID="40677c771a6eef27f42eb8a3b34805ad9728de6d1b1242a9719d34fec2af1356" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.459900 4956 scope.go:117] "RemoveContainer" containerID="2fa3155851bea2048a9f4dbde96c47c167dad16379dc5e6c7456ff264906555b" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.488783 4956 
scope.go:117] "RemoveContainer" containerID="40677c771a6eef27f42eb8a3b34805ad9728de6d1b1242a9719d34fec2af1356" Mar 14 09:35:47 crc kubenswrapper[4956]: E0314 09:35:47.489270 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40677c771a6eef27f42eb8a3b34805ad9728de6d1b1242a9719d34fec2af1356\": container with ID starting with 40677c771a6eef27f42eb8a3b34805ad9728de6d1b1242a9719d34fec2af1356 not found: ID does not exist" containerID="40677c771a6eef27f42eb8a3b34805ad9728de6d1b1242a9719d34fec2af1356" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.489309 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40677c771a6eef27f42eb8a3b34805ad9728de6d1b1242a9719d34fec2af1356"} err="failed to get container status \"40677c771a6eef27f42eb8a3b34805ad9728de6d1b1242a9719d34fec2af1356\": rpc error: code = NotFound desc = could not find container \"40677c771a6eef27f42eb8a3b34805ad9728de6d1b1242a9719d34fec2af1356\": container with ID starting with 40677c771a6eef27f42eb8a3b34805ad9728de6d1b1242a9719d34fec2af1356 not found: ID does not exist" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.489333 4956 scope.go:117] "RemoveContainer" containerID="2fa3155851bea2048a9f4dbde96c47c167dad16379dc5e6c7456ff264906555b" Mar 14 09:35:47 crc kubenswrapper[4956]: E0314 09:35:47.489676 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fa3155851bea2048a9f4dbde96c47c167dad16379dc5e6c7456ff264906555b\": container with ID starting with 2fa3155851bea2048a9f4dbde96c47c167dad16379dc5e6c7456ff264906555b not found: ID does not exist" containerID="2fa3155851bea2048a9f4dbde96c47c167dad16379dc5e6c7456ff264906555b" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.489702 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2fa3155851bea2048a9f4dbde96c47c167dad16379dc5e6c7456ff264906555b"} err="failed to get container status \"2fa3155851bea2048a9f4dbde96c47c167dad16379dc5e6c7456ff264906555b\": rpc error: code = NotFound desc = could not find container \"2fa3155851bea2048a9f4dbde96c47c167dad16379dc5e6c7456ff264906555b\": container with ID starting with 2fa3155851bea2048a9f4dbde96c47c167dad16379dc5e6c7456ff264906555b not found: ID does not exist" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.531579 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-run-httpd\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.531833 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-log-httpd\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.531937 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-scripts\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.532090 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.532225 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxvbf\" (UniqueName: \"kubernetes.io/projected/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-kube-api-access-hxvbf\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.532326 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-log-httpd\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.532120 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-run-httpd\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.532550 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-config-data\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.532672 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.532769 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.536633 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.537022 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-config-data\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.537099 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-scripts\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.537228 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.541083 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 
09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.549506 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxvbf\" (UniqueName: \"kubernetes.io/projected/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-kube-api-access-hxvbf\") pod \"ceilometer-0\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: E0314 09:35:47.646250 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a49256f2f57a6f737c625a9b5947ead43d7e3c63b90657cc855d31c6b7b9e8b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 09:35:47 crc kubenswrapper[4956]: E0314 09:35:47.647704 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a49256f2f57a6f737c625a9b5947ead43d7e3c63b90657cc855d31c6b7b9e8b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 09:35:47 crc kubenswrapper[4956]: E0314 09:35:47.648638 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a49256f2f57a6f737c625a9b5947ead43d7e3c63b90657cc855d31c6b7b9e8b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 09:35:47 crc kubenswrapper[4956]: E0314 09:35:47.648679 4956 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="7efd3e75-902d-40cc-bdfb-4158145f7c49" containerName="watcher-applier" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.729772 
4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:47 crc kubenswrapper[4956]: I0314 09:35:47.839981 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:48 crc kubenswrapper[4956]: I0314 09:35:48.232090 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:48 crc kubenswrapper[4956]: W0314 09:35:48.238682 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6b21ba4_2690_4688_b3df_2fe8d53e36aa.slice/crio-68662914d79b19c0ac826ea8e60a3eadffab1c566035d388612a31093029712c WatchSource:0}: Error finding container 68662914d79b19c0ac826ea8e60a3eadffab1c566035d388612a31093029712c: Status 404 returned error can't find the container with id 68662914d79b19c0ac826ea8e60a3eadffab1c566035d388612a31093029712c Mar 14 09:35:48 crc kubenswrapper[4956]: I0314 09:35:48.321721 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e6b21ba4-2690-4688-b3df-2fe8d53e36aa","Type":"ContainerStarted","Data":"68662914d79b19c0ac826ea8e60a3eadffab1c566035d388612a31093029712c"} Mar 14 09:35:48 crc kubenswrapper[4956]: I0314 09:35:48.643380 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher3add-account-delete-dfmh4" Mar 14 09:35:48 crc kubenswrapper[4956]: I0314 09:35:48.753718 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44081b62-9e6e-40f1-8e17-4a4f42aa8cad-operator-scripts\") pod \"44081b62-9e6e-40f1-8e17-4a4f42aa8cad\" (UID: \"44081b62-9e6e-40f1-8e17-4a4f42aa8cad\") " Mar 14 09:35:48 crc kubenswrapper[4956]: I0314 09:35:48.753829 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8jsr\" (UniqueName: \"kubernetes.io/projected/44081b62-9e6e-40f1-8e17-4a4f42aa8cad-kube-api-access-p8jsr\") pod \"44081b62-9e6e-40f1-8e17-4a4f42aa8cad\" (UID: \"44081b62-9e6e-40f1-8e17-4a4f42aa8cad\") " Mar 14 09:35:48 crc kubenswrapper[4956]: I0314 09:35:48.754667 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44081b62-9e6e-40f1-8e17-4a4f42aa8cad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44081b62-9e6e-40f1-8e17-4a4f42aa8cad" (UID: "44081b62-9e6e-40f1-8e17-4a4f42aa8cad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:35:48 crc kubenswrapper[4956]: I0314 09:35:48.756990 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44081b62-9e6e-40f1-8e17-4a4f42aa8cad-kube-api-access-p8jsr" (OuterVolumeSpecName: "kube-api-access-p8jsr") pod "44081b62-9e6e-40f1-8e17-4a4f42aa8cad" (UID: "44081b62-9e6e-40f1-8e17-4a4f42aa8cad"). InnerVolumeSpecName "kube-api-access-p8jsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:35:48 crc kubenswrapper[4956]: I0314 09:35:48.855379 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44081b62-9e6e-40f1-8e17-4a4f42aa8cad-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:48 crc kubenswrapper[4956]: I0314 09:35:48.855752 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8jsr\" (UniqueName: \"kubernetes.io/projected/44081b62-9e6e-40f1-8e17-4a4f42aa8cad-kube-api-access-p8jsr\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.221122 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2905e6b3-3aa6-47f5-ad12-dd3bd6748e39" path="/var/lib/kubelet/pods/2905e6b3-3aa6-47f5-ad12-dd3bd6748e39/volumes" Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.221960 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f09bee3a-b26b-436f-9462-302a9af35a11" path="/var/lib/kubelet/pods/f09bee3a-b26b-436f-9462-302a9af35a11/volumes" Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.332239 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher3add-account-delete-dfmh4" Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.332234 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher3add-account-delete-dfmh4" event={"ID":"44081b62-9e6e-40f1-8e17-4a4f42aa8cad","Type":"ContainerDied","Data":"7ac8234cd5e2ac4b449ac8bce9db376c2459ec62868a0094207188cb1256f01d"} Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.332362 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ac8234cd5e2ac4b449ac8bce9db376c2459ec62868a0094207188cb1256f01d" Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.333971 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e6b21ba4-2690-4688-b3df-2fe8d53e36aa","Type":"ContainerStarted","Data":"5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1"} Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.336468 4956 generic.go:334] "Generic (PLEG): container finished" podID="7efd3e75-902d-40cc-bdfb-4158145f7c49" containerID="4a49256f2f57a6f737c625a9b5947ead43d7e3c63b90657cc855d31c6b7b9e8b" exitCode=0 Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.336557 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"7efd3e75-902d-40cc-bdfb-4158145f7c49","Type":"ContainerDied","Data":"4a49256f2f57a6f737c625a9b5947ead43d7e3c63b90657cc855d31c6b7b9e8b"} Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.444195 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.464029 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efd3e75-902d-40cc-bdfb-4158145f7c49-combined-ca-bundle\") pod \"7efd3e75-902d-40cc-bdfb-4158145f7c49\" (UID: \"7efd3e75-902d-40cc-bdfb-4158145f7c49\") " Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.464203 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efd3e75-902d-40cc-bdfb-4158145f7c49-config-data\") pod \"7efd3e75-902d-40cc-bdfb-4158145f7c49\" (UID: \"7efd3e75-902d-40cc-bdfb-4158145f7c49\") " Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.464298 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7efd3e75-902d-40cc-bdfb-4158145f7c49-logs\") pod \"7efd3e75-902d-40cc-bdfb-4158145f7c49\" (UID: \"7efd3e75-902d-40cc-bdfb-4158145f7c49\") " Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.464350 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8prh\" (UniqueName: \"kubernetes.io/projected/7efd3e75-902d-40cc-bdfb-4158145f7c49-kube-api-access-j8prh\") pod \"7efd3e75-902d-40cc-bdfb-4158145f7c49\" (UID: \"7efd3e75-902d-40cc-bdfb-4158145f7c49\") " Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.464687 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7efd3e75-902d-40cc-bdfb-4158145f7c49-logs" (OuterVolumeSpecName: "logs") pod "7efd3e75-902d-40cc-bdfb-4158145f7c49" (UID: "7efd3e75-902d-40cc-bdfb-4158145f7c49"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.464799 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7efd3e75-902d-40cc-bdfb-4158145f7c49-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.468761 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7efd3e75-902d-40cc-bdfb-4158145f7c49-kube-api-access-j8prh" (OuterVolumeSpecName: "kube-api-access-j8prh") pod "7efd3e75-902d-40cc-bdfb-4158145f7c49" (UID: "7efd3e75-902d-40cc-bdfb-4158145f7c49"). InnerVolumeSpecName "kube-api-access-j8prh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.490287 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7efd3e75-902d-40cc-bdfb-4158145f7c49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7efd3e75-902d-40cc-bdfb-4158145f7c49" (UID: "7efd3e75-902d-40cc-bdfb-4158145f7c49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.520535 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7efd3e75-902d-40cc-bdfb-4158145f7c49-config-data" (OuterVolumeSpecName: "config-data") pod "7efd3e75-902d-40cc-bdfb-4158145f7c49" (UID: "7efd3e75-902d-40cc-bdfb-4158145f7c49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.565908 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8prh\" (UniqueName: \"kubernetes.io/projected/7efd3e75-902d-40cc-bdfb-4158145f7c49-kube-api-access-j8prh\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.565943 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efd3e75-902d-40cc-bdfb-4158145f7c49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:49 crc kubenswrapper[4956]: I0314 09:35:49.565952 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efd3e75-902d-40cc-bdfb-4158145f7c49-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.317673 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.344842 4956 generic.go:334] "Generic (PLEG): container finished" podID="8fe2a92d-ff95-4ead-b54e-790a69b16631" containerID="487a19ff26075691c7ac5a93f380fb7c32bfa027fa19d18a6cc7c653dacbde69" exitCode=0 Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.344905 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"8fe2a92d-ff95-4ead-b54e-790a69b16631","Type":"ContainerDied","Data":"487a19ff26075691c7ac5a93f380fb7c32bfa027fa19d18a6cc7c653dacbde69"} Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.344931 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"8fe2a92d-ff95-4ead-b54e-790a69b16631","Type":"ContainerDied","Data":"18beac058aef1cc94d9a4c5227d795104d5d9007f362517cd2e6804c454f90ea"} Mar 14 09:35:50 
crc kubenswrapper[4956]: I0314 09:35:50.344948 4956 scope.go:117] "RemoveContainer" containerID="487a19ff26075691c7ac5a93f380fb7c32bfa027fa19d18a6cc7c653dacbde69" Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.345051 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.350501 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"7efd3e75-902d-40cc-bdfb-4158145f7c49","Type":"ContainerDied","Data":"0feeaa3a7016e195e46259b66f27b524d236418e8bd4cd2f877f228fba4e104b"} Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.350546 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.352925 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e6b21ba4-2690-4688-b3df-2fe8d53e36aa","Type":"ContainerStarted","Data":"9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951"} Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.352953 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e6b21ba4-2690-4688-b3df-2fe8d53e36aa","Type":"ContainerStarted","Data":"c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6"} Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.372969 4956 scope.go:117] "RemoveContainer" containerID="487a19ff26075691c7ac5a93f380fb7c32bfa027fa19d18a6cc7c653dacbde69" Mar 14 09:35:50 crc kubenswrapper[4956]: E0314 09:35:50.373437 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"487a19ff26075691c7ac5a93f380fb7c32bfa027fa19d18a6cc7c653dacbde69\": container with ID starting with 
487a19ff26075691c7ac5a93f380fb7c32bfa027fa19d18a6cc7c653dacbde69 not found: ID does not exist" containerID="487a19ff26075691c7ac5a93f380fb7c32bfa027fa19d18a6cc7c653dacbde69" Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.373756 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"487a19ff26075691c7ac5a93f380fb7c32bfa027fa19d18a6cc7c653dacbde69"} err="failed to get container status \"487a19ff26075691c7ac5a93f380fb7c32bfa027fa19d18a6cc7c653dacbde69\": rpc error: code = NotFound desc = could not find container \"487a19ff26075691c7ac5a93f380fb7c32bfa027fa19d18a6cc7c653dacbde69\": container with ID starting with 487a19ff26075691c7ac5a93f380fb7c32bfa027fa19d18a6cc7c653dacbde69 not found: ID does not exist" Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.373796 4956 scope.go:117] "RemoveContainer" containerID="4a49256f2f57a6f737c625a9b5947ead43d7e3c63b90657cc855d31c6b7b9e8b" Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.389021 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.395141 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.485247 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fe2a92d-ff95-4ead-b54e-790a69b16631-logs\") pod \"8fe2a92d-ff95-4ead-b54e-790a69b16631\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.485575 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-custom-prometheus-ca\") pod \"8fe2a92d-ff95-4ead-b54e-790a69b16631\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " Mar 14 
09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.485600 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-config-data\") pod \"8fe2a92d-ff95-4ead-b54e-790a69b16631\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.485616 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-combined-ca-bundle\") pod \"8fe2a92d-ff95-4ead-b54e-790a69b16631\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.485609 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fe2a92d-ff95-4ead-b54e-790a69b16631-logs" (OuterVolumeSpecName: "logs") pod "8fe2a92d-ff95-4ead-b54e-790a69b16631" (UID: "8fe2a92d-ff95-4ead-b54e-790a69b16631"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.485644 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g548h\" (UniqueName: \"kubernetes.io/projected/8fe2a92d-ff95-4ead-b54e-790a69b16631-kube-api-access-g548h\") pod \"8fe2a92d-ff95-4ead-b54e-790a69b16631\" (UID: \"8fe2a92d-ff95-4ead-b54e-790a69b16631\") " Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.485859 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fe2a92d-ff95-4ead-b54e-790a69b16631-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.493672 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fe2a92d-ff95-4ead-b54e-790a69b16631-kube-api-access-g548h" (OuterVolumeSpecName: "kube-api-access-g548h") pod "8fe2a92d-ff95-4ead-b54e-790a69b16631" (UID: "8fe2a92d-ff95-4ead-b54e-790a69b16631"). InnerVolumeSpecName "kube-api-access-g548h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.546639 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fe2a92d-ff95-4ead-b54e-790a69b16631" (UID: "8fe2a92d-ff95-4ead-b54e-790a69b16631"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.566767 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "8fe2a92d-ff95-4ead-b54e-790a69b16631" (UID: "8fe2a92d-ff95-4ead-b54e-790a69b16631"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.588659 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.588682 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.588693 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g548h\" (UniqueName: \"kubernetes.io/projected/8fe2a92d-ff95-4ead-b54e-790a69b16631-kube-api-access-g548h\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.588955 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-config-data" (OuterVolumeSpecName: "config-data") pod "8fe2a92d-ff95-4ead-b54e-790a69b16631" (UID: "8fe2a92d-ff95-4ead-b54e-790a69b16631"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.632385 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-kt2vf"] Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.638626 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-kt2vf"] Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.645751 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher3add-account-delete-dfmh4"] Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.651811 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher3add-account-delete-dfmh4"] Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.661419 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-3add-account-create-update-mfq8n"] Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.669164 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-3add-account-create-update-mfq8n"] Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.676812 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.684309 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:35:50 crc kubenswrapper[4956]: I0314 09:35:50.689975 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe2a92d-ff95-4ead-b54e-790a69b16631-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.220678 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44081b62-9e6e-40f1-8e17-4a4f42aa8cad" path="/var/lib/kubelet/pods/44081b62-9e6e-40f1-8e17-4a4f42aa8cad/volumes" 
Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.221314 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e365158-1936-463b-bfaa-a8efd23f8591" path="/var/lib/kubelet/pods/7e365158-1936-463b-bfaa-a8efd23f8591/volumes" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.221992 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7efd3e75-902d-40cc-bdfb-4158145f7c49" path="/var/lib/kubelet/pods/7efd3e75-902d-40cc-bdfb-4158145f7c49/volumes" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.223166 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fe2a92d-ff95-4ead-b54e-790a69b16631" path="/var/lib/kubelet/pods/8fe2a92d-ff95-4ead-b54e-790a69b16631/volumes" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.223780 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef22fc53-e58a-45b9-a255-8969c18c0667" path="/var/lib/kubelet/pods/ef22fc53-e58a-45b9-a255-8969c18c0667/volumes" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.678801 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-fv2n7"] Mar 14 09:35:51 crc kubenswrapper[4956]: E0314 09:35:51.679304 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe2a92d-ff95-4ead-b54e-790a69b16631" containerName="watcher-decision-engine" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.679317 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe2a92d-ff95-4ead-b54e-790a69b16631" containerName="watcher-decision-engine" Mar 14 09:35:51 crc kubenswrapper[4956]: E0314 09:35:51.679343 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7efd3e75-902d-40cc-bdfb-4158145f7c49" containerName="watcher-applier" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.679349 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7efd3e75-902d-40cc-bdfb-4158145f7c49" containerName="watcher-applier" Mar 14 09:35:51 crc 
kubenswrapper[4956]: E0314 09:35:51.679364 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44081b62-9e6e-40f1-8e17-4a4f42aa8cad" containerName="mariadb-account-delete" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.679371 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="44081b62-9e6e-40f1-8e17-4a4f42aa8cad" containerName="mariadb-account-delete" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.679532 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fe2a92d-ff95-4ead-b54e-790a69b16631" containerName="watcher-decision-engine" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.679548 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="44081b62-9e6e-40f1-8e17-4a4f42aa8cad" containerName="mariadb-account-delete" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.679561 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7efd3e75-902d-40cc-bdfb-4158145f7c49" containerName="watcher-applier" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.680067 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-fv2n7" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.693998 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-fv2n7"] Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.770534 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-5cb7-account-create-update-972hv"] Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.771430 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-5cb7-account-create-update-972hv" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.772898 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.783548 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-5cb7-account-create-update-972hv"] Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.806681 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vdjc\" (UniqueName: \"kubernetes.io/projected/26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2-kube-api-access-8vdjc\") pod \"watcher-db-create-fv2n7\" (UID: \"26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2\") " pod="watcher-kuttl-default/watcher-db-create-fv2n7" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.806796 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2-operator-scripts\") pod \"watcher-db-create-fv2n7\" (UID: \"26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2\") " pod="watcher-kuttl-default/watcher-db-create-fv2n7" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.907812 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4gwh\" (UniqueName: \"kubernetes.io/projected/a703874a-c31f-4e92-8b4a-2f3e28d31666-kube-api-access-x4gwh\") pod \"watcher-5cb7-account-create-update-972hv\" (UID: \"a703874a-c31f-4e92-8b4a-2f3e28d31666\") " pod="watcher-kuttl-default/watcher-5cb7-account-create-update-972hv" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.907880 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2-operator-scripts\") pod \"watcher-db-create-fv2n7\" (UID: \"26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2\") " pod="watcher-kuttl-default/watcher-db-create-fv2n7" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.907903 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a703874a-c31f-4e92-8b4a-2f3e28d31666-operator-scripts\") pod \"watcher-5cb7-account-create-update-972hv\" (UID: \"a703874a-c31f-4e92-8b4a-2f3e28d31666\") " pod="watcher-kuttl-default/watcher-5cb7-account-create-update-972hv" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.907965 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vdjc\" (UniqueName: \"kubernetes.io/projected/26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2-kube-api-access-8vdjc\") pod \"watcher-db-create-fv2n7\" (UID: \"26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2\") " pod="watcher-kuttl-default/watcher-db-create-fv2n7" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.909199 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2-operator-scripts\") pod \"watcher-db-create-fv2n7\" (UID: \"26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2\") " pod="watcher-kuttl-default/watcher-db-create-fv2n7" Mar 14 09:35:51 crc kubenswrapper[4956]: I0314 09:35:51.923154 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vdjc\" (UniqueName: \"kubernetes.io/projected/26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2-kube-api-access-8vdjc\") pod \"watcher-db-create-fv2n7\" (UID: \"26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2\") " pod="watcher-kuttl-default/watcher-db-create-fv2n7" Mar 14 09:35:52 crc kubenswrapper[4956]: I0314 09:35:52.008380 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-fv2n7" Mar 14 09:35:52 crc kubenswrapper[4956]: I0314 09:35:52.008821 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4gwh\" (UniqueName: \"kubernetes.io/projected/a703874a-c31f-4e92-8b4a-2f3e28d31666-kube-api-access-x4gwh\") pod \"watcher-5cb7-account-create-update-972hv\" (UID: \"a703874a-c31f-4e92-8b4a-2f3e28d31666\") " pod="watcher-kuttl-default/watcher-5cb7-account-create-update-972hv" Mar 14 09:35:52 crc kubenswrapper[4956]: I0314 09:35:52.008885 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a703874a-c31f-4e92-8b4a-2f3e28d31666-operator-scripts\") pod \"watcher-5cb7-account-create-update-972hv\" (UID: \"a703874a-c31f-4e92-8b4a-2f3e28d31666\") " pod="watcher-kuttl-default/watcher-5cb7-account-create-update-972hv" Mar 14 09:35:52 crc kubenswrapper[4956]: I0314 09:35:52.009531 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a703874a-c31f-4e92-8b4a-2f3e28d31666-operator-scripts\") pod \"watcher-5cb7-account-create-update-972hv\" (UID: \"a703874a-c31f-4e92-8b4a-2f3e28d31666\") " pod="watcher-kuttl-default/watcher-5cb7-account-create-update-972hv" Mar 14 09:35:52 crc kubenswrapper[4956]: I0314 09:35:52.026653 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4gwh\" (UniqueName: \"kubernetes.io/projected/a703874a-c31f-4e92-8b4a-2f3e28d31666-kube-api-access-x4gwh\") pod \"watcher-5cb7-account-create-update-972hv\" (UID: \"a703874a-c31f-4e92-8b4a-2f3e28d31666\") " pod="watcher-kuttl-default/watcher-5cb7-account-create-update-972hv" Mar 14 09:35:52 crc kubenswrapper[4956]: I0314 09:35:52.086627 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-5cb7-account-create-update-972hv" Mar 14 09:35:52 crc kubenswrapper[4956]: I0314 09:35:52.242673 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-fv2n7"] Mar 14 09:35:52 crc kubenswrapper[4956]: I0314 09:35:52.385254 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-fv2n7" event={"ID":"26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2","Type":"ContainerStarted","Data":"b6b9f4722e310a8c05fdf5cfa22c03c4739a98932a7508c17fc110fde936a146"} Mar 14 09:35:52 crc kubenswrapper[4956]: I0314 09:35:52.391011 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e6b21ba4-2690-4688-b3df-2fe8d53e36aa","Type":"ContainerStarted","Data":"4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b"} Mar 14 09:35:52 crc kubenswrapper[4956]: I0314 09:35:52.391201 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerName="ceilometer-central-agent" containerID="cri-o://5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1" gracePeriod=30 Mar 14 09:35:52 crc kubenswrapper[4956]: I0314 09:35:52.391431 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerName="ceilometer-notification-agent" containerID="cri-o://c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6" gracePeriod=30 Mar 14 09:35:52 crc kubenswrapper[4956]: I0314 09:35:52.391462 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:52 crc kubenswrapper[4956]: I0314 09:35:52.391517 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" 
podUID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerName="proxy-httpd" containerID="cri-o://4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b" gracePeriod=30 Mar 14 09:35:52 crc kubenswrapper[4956]: I0314 09:35:52.391903 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerName="sg-core" containerID="cri-o://9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951" gracePeriod=30 Mar 14 09:35:52 crc kubenswrapper[4956]: I0314 09:35:52.425294 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.160711762 podStartE2EDuration="5.425274935s" podCreationTimestamp="2026-03-14 09:35:47 +0000 UTC" firstStartedPulling="2026-03-14 09:35:48.240574117 +0000 UTC m=+2353.753266385" lastFinishedPulling="2026-03-14 09:35:51.50513729 +0000 UTC m=+2357.017829558" observedRunningTime="2026-03-14 09:35:52.414974813 +0000 UTC m=+2357.927667081" watchObservedRunningTime="2026-03-14 09:35:52.425274935 +0000 UTC m=+2357.937967193" Mar 14 09:35:52 crc kubenswrapper[4956]: I0314 09:35:52.635801 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-5cb7-account-create-update-972hv"] Mar 14 09:35:52 crc kubenswrapper[4956]: W0314 09:35:52.651068 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda703874a_c31f_4e92_8b4a_2f3e28d31666.slice/crio-034c66c10bc49563177e684e5bc127802457cc11fd2733fba5200e443de6ba0c WatchSource:0}: Error finding container 034c66c10bc49563177e684e5bc127802457cc11fd2733fba5200e443de6ba0c: Status 404 returned error can't find the container with id 034c66c10bc49563177e684e5bc127802457cc11fd2733fba5200e443de6ba0c Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.121143 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.226032 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-ceilometer-tls-certs\") pod \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.226102 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-config-data\") pod \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.226226 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-log-httpd\") pod \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.226248 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-scripts\") pod \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.226306 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxvbf\" (UniqueName: \"kubernetes.io/projected/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-kube-api-access-hxvbf\") pod \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.226334 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-combined-ca-bundle\") pod \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.226374 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-run-httpd\") pod \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.226398 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-sg-core-conf-yaml\") pod \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\" (UID: \"e6b21ba4-2690-4688-b3df-2fe8d53e36aa\") " Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.226690 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e6b21ba4-2690-4688-b3df-2fe8d53e36aa" (UID: "e6b21ba4-2690-4688-b3df-2fe8d53e36aa"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.226906 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e6b21ba4-2690-4688-b3df-2fe8d53e36aa" (UID: "e6b21ba4-2690-4688-b3df-2fe8d53e36aa"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.231341 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-kube-api-access-hxvbf" (OuterVolumeSpecName: "kube-api-access-hxvbf") pod "e6b21ba4-2690-4688-b3df-2fe8d53e36aa" (UID: "e6b21ba4-2690-4688-b3df-2fe8d53e36aa"). InnerVolumeSpecName "kube-api-access-hxvbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.236070 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-scripts" (OuterVolumeSpecName: "scripts") pod "e6b21ba4-2690-4688-b3df-2fe8d53e36aa" (UID: "e6b21ba4-2690-4688-b3df-2fe8d53e36aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.249179 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e6b21ba4-2690-4688-b3df-2fe8d53e36aa" (UID: "e6b21ba4-2690-4688-b3df-2fe8d53e36aa"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.268035 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e6b21ba4-2690-4688-b3df-2fe8d53e36aa" (UID: "e6b21ba4-2690-4688-b3df-2fe8d53e36aa"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.287406 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6b21ba4-2690-4688-b3df-2fe8d53e36aa" (UID: "e6b21ba4-2690-4688-b3df-2fe8d53e36aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.303942 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-config-data" (OuterVolumeSpecName: "config-data") pod "e6b21ba4-2690-4688-b3df-2fe8d53e36aa" (UID: "e6b21ba4-2690-4688-b3df-2fe8d53e36aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.328561 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.328595 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.328604 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.328616 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxvbf\" (UniqueName: \"kubernetes.io/projected/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-kube-api-access-hxvbf\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:53 crc 
kubenswrapper[4956]: I0314 09:35:53.328627 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.328634 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.328642 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.328652 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6b21ba4-2690-4688-b3df-2fe8d53e36aa-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.404479 4956 generic.go:334] "Generic (PLEG): container finished" podID="a703874a-c31f-4e92-8b4a-2f3e28d31666" containerID="9d2f1d2c6d31fbb3c8b5a14804de445ff867d0054bc3dc64075edec8f4ace304" exitCode=0 Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.404528 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-5cb7-account-create-update-972hv" event={"ID":"a703874a-c31f-4e92-8b4a-2f3e28d31666","Type":"ContainerDied","Data":"9d2f1d2c6d31fbb3c8b5a14804de445ff867d0054bc3dc64075edec8f4ace304"} Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.404572 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-5cb7-account-create-update-972hv" event={"ID":"a703874a-c31f-4e92-8b4a-2f3e28d31666","Type":"ContainerStarted","Data":"034c66c10bc49563177e684e5bc127802457cc11fd2733fba5200e443de6ba0c"} Mar 14 
09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.407490 4956 generic.go:334] "Generic (PLEG): container finished" podID="26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2" containerID="162b46697deba9d151b09ac46230c77e3913f17249fc1a405592489c8669d5b5" exitCode=0 Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.407558 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-fv2n7" event={"ID":"26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2","Type":"ContainerDied","Data":"162b46697deba9d151b09ac46230c77e3913f17249fc1a405592489c8669d5b5"} Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.411131 4956 generic.go:334] "Generic (PLEG): container finished" podID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerID="4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b" exitCode=0 Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.411171 4956 generic.go:334] "Generic (PLEG): container finished" podID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerID="9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951" exitCode=2 Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.411184 4956 generic.go:334] "Generic (PLEG): container finished" podID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerID="c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6" exitCode=0 Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.411194 4956 generic.go:334] "Generic (PLEG): container finished" podID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerID="5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1" exitCode=0 Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.411225 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e6b21ba4-2690-4688-b3df-2fe8d53e36aa","Type":"ContainerDied","Data":"4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b"} Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.411265 4956 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e6b21ba4-2690-4688-b3df-2fe8d53e36aa","Type":"ContainerDied","Data":"9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951"} Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.411276 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e6b21ba4-2690-4688-b3df-2fe8d53e36aa","Type":"ContainerDied","Data":"c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6"} Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.411286 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e6b21ba4-2690-4688-b3df-2fe8d53e36aa","Type":"ContainerDied","Data":"5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1"} Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.411294 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e6b21ba4-2690-4688-b3df-2fe8d53e36aa","Type":"ContainerDied","Data":"68662914d79b19c0ac826ea8e60a3eadffab1c566035d388612a31093029712c"} Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.411318 4956 scope.go:117] "RemoveContainer" containerID="4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.411748 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.442236 4956 scope.go:117] "RemoveContainer" containerID="9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.465448 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.469359 4956 scope.go:117] "RemoveContainer" containerID="c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.474542 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.503682 4956 scope.go:117] "RemoveContainer" containerID="5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.517308 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:53 crc kubenswrapper[4956]: E0314 09:35:53.518207 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerName="proxy-httpd" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.518227 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerName="proxy-httpd" Mar 14 09:35:53 crc kubenswrapper[4956]: E0314 09:35:53.518255 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerName="sg-core" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.518262 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerName="sg-core" Mar 14 09:35:53 crc kubenswrapper[4956]: E0314 09:35:53.518278 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerName="ceilometer-notification-agent" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.518284 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerName="ceilometer-notification-agent" Mar 14 09:35:53 crc kubenswrapper[4956]: E0314 09:35:53.518308 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerName="ceilometer-central-agent" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.518313 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerName="ceilometer-central-agent" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.519248 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerName="proxy-httpd" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.519272 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerName="ceilometer-central-agent" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.519296 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerName="sg-core" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.519309 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" containerName="ceilometer-notification-agent" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.523214 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.526191 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.526823 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.526960 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.527348 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.534500 4956 scope.go:117] "RemoveContainer" containerID="4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b" Mar 14 09:35:53 crc kubenswrapper[4956]: E0314 09:35:53.534916 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b\": container with ID starting with 4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b not found: ID does not exist" containerID="4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.534953 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b"} err="failed to get container status \"4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b\": rpc error: code = NotFound desc = could not find container \"4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b\": container with ID starting with 4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b not found: ID does not 
exist" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.534975 4956 scope.go:117] "RemoveContainer" containerID="9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951" Mar 14 09:35:53 crc kubenswrapper[4956]: E0314 09:35:53.535251 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951\": container with ID starting with 9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951 not found: ID does not exist" containerID="9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.535307 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951"} err="failed to get container status \"9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951\": rpc error: code = NotFound desc = could not find container \"9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951\": container with ID starting with 9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951 not found: ID does not exist" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.535345 4956 scope.go:117] "RemoveContainer" containerID="c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6" Mar 14 09:35:53 crc kubenswrapper[4956]: E0314 09:35:53.535751 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6\": container with ID starting with c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6 not found: ID does not exist" containerID="c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.535795 4956 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6"} err="failed to get container status \"c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6\": rpc error: code = NotFound desc = could not find container \"c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6\": container with ID starting with c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6 not found: ID does not exist" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.535816 4956 scope.go:117] "RemoveContainer" containerID="5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1" Mar 14 09:35:53 crc kubenswrapper[4956]: E0314 09:35:53.536054 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1\": container with ID starting with 5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1 not found: ID does not exist" containerID="5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.536079 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1"} err="failed to get container status \"5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1\": rpc error: code = NotFound desc = could not find container \"5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1\": container with ID starting with 5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1 not found: ID does not exist" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.536096 4956 scope.go:117] "RemoveContainer" containerID="4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.536673 4956 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b"} err="failed to get container status \"4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b\": rpc error: code = NotFound desc = could not find container \"4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b\": container with ID starting with 4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b not found: ID does not exist" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.536700 4956 scope.go:117] "RemoveContainer" containerID="9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.537438 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951"} err="failed to get container status \"9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951\": rpc error: code = NotFound desc = could not find container \"9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951\": container with ID starting with 9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951 not found: ID does not exist" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.537461 4956 scope.go:117] "RemoveContainer" containerID="c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.538823 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6"} err="failed to get container status \"c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6\": rpc error: code = NotFound desc = could not find container \"c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6\": container with ID starting with 
c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6 not found: ID does not exist" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.538886 4956 scope.go:117] "RemoveContainer" containerID="5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.539536 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1"} err="failed to get container status \"5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1\": rpc error: code = NotFound desc = could not find container \"5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1\": container with ID starting with 5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1 not found: ID does not exist" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.539644 4956 scope.go:117] "RemoveContainer" containerID="4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.540799 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b"} err="failed to get container status \"4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b\": rpc error: code = NotFound desc = could not find container \"4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b\": container with ID starting with 4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b not found: ID does not exist" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.540896 4956 scope.go:117] "RemoveContainer" containerID="9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.544684 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951"} err="failed to get container status \"9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951\": rpc error: code = NotFound desc = could not find container \"9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951\": container with ID starting with 9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951 not found: ID does not exist" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.544742 4956 scope.go:117] "RemoveContainer" containerID="c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.547951 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6"} err="failed to get container status \"c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6\": rpc error: code = NotFound desc = could not find container \"c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6\": container with ID starting with c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6 not found: ID does not exist" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.548007 4956 scope.go:117] "RemoveContainer" containerID="5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.548389 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1"} err="failed to get container status \"5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1\": rpc error: code = NotFound desc = could not find container \"5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1\": container with ID starting with 5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1 not found: ID does not 
exist" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.548415 4956 scope.go:117] "RemoveContainer" containerID="4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.548801 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b"} err="failed to get container status \"4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b\": rpc error: code = NotFound desc = could not find container \"4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b\": container with ID starting with 4144ec26a43b22bbeac57376a3e393dbdbae6a5c409bf2e1b069ec558a1ebc9b not found: ID does not exist" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.548850 4956 scope.go:117] "RemoveContainer" containerID="9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.549169 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951"} err="failed to get container status \"9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951\": rpc error: code = NotFound desc = could not find container \"9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951\": container with ID starting with 9f261f8a7d2b5edbb12b646376eef1dcb1ab086e570e0a938dd20d2aa1222951 not found: ID does not exist" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.549200 4956 scope.go:117] "RemoveContainer" containerID="c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.549551 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6"} err="failed to get container status 
\"c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6\": rpc error: code = NotFound desc = could not find container \"c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6\": container with ID starting with c944422e03ca8c514f0833fd24d9d410a474703afec05e9e36eeb28925b03ad6 not found: ID does not exist" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.549582 4956 scope.go:117] "RemoveContainer" containerID="5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.549908 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1"} err="failed to get container status \"5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1\": rpc error: code = NotFound desc = could not find container \"5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1\": container with ID starting with 5f9edd7a46a4d14fd23afd2d2195beab9fa486fa5f17f07786ec74f9a5908be1 not found: ID does not exist" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.634106 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.634180 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-config-data\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.634242 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05536f1a-0f62-4ddb-93a0-85429c9676ee-log-httpd\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.634270 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-scripts\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.634300 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cr2c\" (UniqueName: \"kubernetes.io/projected/05536f1a-0f62-4ddb-93a0-85429c9676ee-kube-api-access-7cr2c\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.634435 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.634504 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05536f1a-0f62-4ddb-93a0-85429c9676ee-run-httpd\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.634560 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.736252 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.736936 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-config-data\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.736985 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05536f1a-0f62-4ddb-93a0-85429c9676ee-log-httpd\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.737018 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cr2c\" (UniqueName: \"kubernetes.io/projected/05536f1a-0f62-4ddb-93a0-85429c9676ee-kube-api-access-7cr2c\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.737034 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-scripts\") pod \"ceilometer-0\" (UID: 
\"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.737064 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.737079 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05536f1a-0f62-4ddb-93a0-85429c9676ee-run-httpd\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.737100 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.738416 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05536f1a-0f62-4ddb-93a0-85429c9676ee-run-httpd\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.738728 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05536f1a-0f62-4ddb-93a0-85429c9676ee-log-httpd\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.740165 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.740384 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.741050 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.741967 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-config-data\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.742458 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-scripts\") pod \"ceilometer-0\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.753855 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cr2c\" (UniqueName: \"kubernetes.io/projected/05536f1a-0f62-4ddb-93a0-85429c9676ee-kube-api-access-7cr2c\") pod \"ceilometer-0\" (UID: 
\"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:53 crc kubenswrapper[4956]: I0314 09:35:53.846187 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:54 crc kubenswrapper[4956]: I0314 09:35:54.307470 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:35:54 crc kubenswrapper[4956]: I0314 09:35:54.421558 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"05536f1a-0f62-4ddb-93a0-85429c9676ee","Type":"ContainerStarted","Data":"0a78c77183eec02d9837d2ca48e647b0e31713cbb604d45912cdcb7a98da4e86"} Mar 14 09:35:54 crc kubenswrapper[4956]: I0314 09:35:54.786966 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-5cb7-account-create-update-972hv" Mar 14 09:35:54 crc kubenswrapper[4956]: I0314 09:35:54.792032 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-fv2n7" Mar 14 09:35:54 crc kubenswrapper[4956]: I0314 09:35:54.961149 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vdjc\" (UniqueName: \"kubernetes.io/projected/26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2-kube-api-access-8vdjc\") pod \"26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2\" (UID: \"26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2\") " Mar 14 09:35:54 crc kubenswrapper[4956]: I0314 09:35:54.961340 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2-operator-scripts\") pod \"26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2\" (UID: \"26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2\") " Mar 14 09:35:54 crc kubenswrapper[4956]: I0314 09:35:54.961388 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4gwh\" (UniqueName: \"kubernetes.io/projected/a703874a-c31f-4e92-8b4a-2f3e28d31666-kube-api-access-x4gwh\") pod \"a703874a-c31f-4e92-8b4a-2f3e28d31666\" (UID: \"a703874a-c31f-4e92-8b4a-2f3e28d31666\") " Mar 14 09:35:54 crc kubenswrapper[4956]: I0314 09:35:54.961508 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a703874a-c31f-4e92-8b4a-2f3e28d31666-operator-scripts\") pod \"a703874a-c31f-4e92-8b4a-2f3e28d31666\" (UID: \"a703874a-c31f-4e92-8b4a-2f3e28d31666\") " Mar 14 09:35:54 crc kubenswrapper[4956]: I0314 09:35:54.962771 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a703874a-c31f-4e92-8b4a-2f3e28d31666-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a703874a-c31f-4e92-8b4a-2f3e28d31666" (UID: "a703874a-c31f-4e92-8b4a-2f3e28d31666"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:35:54 crc kubenswrapper[4956]: I0314 09:35:54.962890 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2" (UID: "26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:35:54 crc kubenswrapper[4956]: I0314 09:35:54.964780 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2-kube-api-access-8vdjc" (OuterVolumeSpecName: "kube-api-access-8vdjc") pod "26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2" (UID: "26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2"). InnerVolumeSpecName "kube-api-access-8vdjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:35:54 crc kubenswrapper[4956]: I0314 09:35:54.967884 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a703874a-c31f-4e92-8b4a-2f3e28d31666-kube-api-access-x4gwh" (OuterVolumeSpecName: "kube-api-access-x4gwh") pod "a703874a-c31f-4e92-8b4a-2f3e28d31666" (UID: "a703874a-c31f-4e92-8b4a-2f3e28d31666"). InnerVolumeSpecName "kube-api-access-x4gwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:35:55 crc kubenswrapper[4956]: I0314 09:35:55.062987 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:55 crc kubenswrapper[4956]: I0314 09:35:55.063799 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4gwh\" (UniqueName: \"kubernetes.io/projected/a703874a-c31f-4e92-8b4a-2f3e28d31666-kube-api-access-x4gwh\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:55 crc kubenswrapper[4956]: I0314 09:35:55.063819 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a703874a-c31f-4e92-8b4a-2f3e28d31666-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:55 crc kubenswrapper[4956]: I0314 09:35:55.063828 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vdjc\" (UniqueName: \"kubernetes.io/projected/26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2-kube-api-access-8vdjc\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:55 crc kubenswrapper[4956]: I0314 09:35:55.220038 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b21ba4-2690-4688-b3df-2fe8d53e36aa" path="/var/lib/kubelet/pods/e6b21ba4-2690-4688-b3df-2fe8d53e36aa/volumes" Mar 14 09:35:55 crc kubenswrapper[4956]: I0314 09:35:55.431852 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-fv2n7" event={"ID":"26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2","Type":"ContainerDied","Data":"b6b9f4722e310a8c05fdf5cfa22c03c4739a98932a7508c17fc110fde936a146"} Mar 14 09:35:55 crc kubenswrapper[4956]: I0314 09:35:55.431901 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6b9f4722e310a8c05fdf5cfa22c03c4739a98932a7508c17fc110fde936a146" Mar 14 09:35:55 crc kubenswrapper[4956]: I0314 
09:35:55.431962 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-fv2n7" Mar 14 09:35:55 crc kubenswrapper[4956]: I0314 09:35:55.435095 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-5cb7-account-create-update-972hv" event={"ID":"a703874a-c31f-4e92-8b4a-2f3e28d31666","Type":"ContainerDied","Data":"034c66c10bc49563177e684e5bc127802457cc11fd2733fba5200e443de6ba0c"} Mar 14 09:35:55 crc kubenswrapper[4956]: I0314 09:35:55.435139 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="034c66c10bc49563177e684e5bc127802457cc11fd2733fba5200e443de6ba0c" Mar 14 09:35:55 crc kubenswrapper[4956]: I0314 09:35:55.435213 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-5cb7-account-create-update-972hv" Mar 14 09:35:55 crc kubenswrapper[4956]: I0314 09:35:55.437243 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"05536f1a-0f62-4ddb-93a0-85429c9676ee","Type":"ContainerStarted","Data":"68421e69b28a8cb4db9a5d2676dff5bfa7589188decb5f161692a04bc11623d1"} Mar 14 09:35:56 crc kubenswrapper[4956]: I0314 09:35:56.449619 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"05536f1a-0f62-4ddb-93a0-85429c9676ee","Type":"ContainerStarted","Data":"30a29513ca0cab96cf0f62fa5e5fe5620a39260187952f5c06e858fdcd2b23a3"} Mar 14 09:35:56 crc kubenswrapper[4956]: I0314 09:35:56.449661 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"05536f1a-0f62-4ddb-93a0-85429c9676ee","Type":"ContainerStarted","Data":"3f895236543aa337b10720419d402edb5803dc4b8194157744f1817b438202fb"} Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.118629 4956 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2"] Mar 14 09:35:57 crc kubenswrapper[4956]: E0314 09:35:57.118987 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2" containerName="mariadb-database-create" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.119005 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2" containerName="mariadb-database-create" Mar 14 09:35:57 crc kubenswrapper[4956]: E0314 09:35:57.119027 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a703874a-c31f-4e92-8b4a-2f3e28d31666" containerName="mariadb-account-create-update" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.119035 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a703874a-c31f-4e92-8b4a-2f3e28d31666" containerName="mariadb-account-create-update" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.119194 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2" containerName="mariadb-database-create" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.119220 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="a703874a-c31f-4e92-8b4a-2f3e28d31666" containerName="mariadb-account-create-update" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.119724 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.121950 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-bpb9f" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.122734 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.151903 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2"] Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.204957 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-fhbq2\" (UID: \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.205021 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntxj8\" (UniqueName: \"kubernetes.io/projected/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-kube-api-access-ntxj8\") pod \"watcher-kuttl-db-sync-fhbq2\" (UID: \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.205049 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-db-sync-config-data\") pod \"watcher-kuttl-db-sync-fhbq2\" (UID: \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.205145 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-config-data\") pod \"watcher-kuttl-db-sync-fhbq2\" (UID: \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.306043 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-config-data\") pod \"watcher-kuttl-db-sync-fhbq2\" (UID: \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.306348 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-fhbq2\" (UID: \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.306506 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntxj8\" (UniqueName: \"kubernetes.io/projected/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-kube-api-access-ntxj8\") pod \"watcher-kuttl-db-sync-fhbq2\" (UID: \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.306605 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-db-sync-config-data\") pod \"watcher-kuttl-db-sync-fhbq2\" (UID: \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" Mar 14 09:35:57 crc kubenswrapper[4956]: 
I0314 09:35:57.311283 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-config-data\") pod \"watcher-kuttl-db-sync-fhbq2\" (UID: \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.312048 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-fhbq2\" (UID: \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.323257 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-db-sync-config-data\") pod \"watcher-kuttl-db-sync-fhbq2\" (UID: \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.325235 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntxj8\" (UniqueName: \"kubernetes.io/projected/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-kube-api-access-ntxj8\") pod \"watcher-kuttl-db-sync-fhbq2\" (UID: \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.448446 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" Mar 14 09:35:57 crc kubenswrapper[4956]: I0314 09:35:57.769244 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2"] Mar 14 09:35:58 crc kubenswrapper[4956]: I0314 09:35:58.477956 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"05536f1a-0f62-4ddb-93a0-85429c9676ee","Type":"ContainerStarted","Data":"6763e394fe4c0e25d15a5dd2520514161dbe0a66dc64f6ee68fb73bc7d56f420"} Mar 14 09:35:58 crc kubenswrapper[4956]: I0314 09:35:58.479727 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:35:58 crc kubenswrapper[4956]: I0314 09:35:58.482090 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" event={"ID":"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c","Type":"ContainerStarted","Data":"531ab46c4edb095fea3a136af494feba3e4ea88c168d6565d7a3e64600d61f72"} Mar 14 09:35:58 crc kubenswrapper[4956]: I0314 09:35:58.482214 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" event={"ID":"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c","Type":"ContainerStarted","Data":"4a1b88cc259cf5c2bdecc0590de6392ac1c8b9e2291f980733cfd9d008d209c7"} Mar 14 09:35:58 crc kubenswrapper[4956]: I0314 09:35:58.504466 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.10400385 podStartE2EDuration="5.504446412s" podCreationTimestamp="2026-03-14 09:35:53 +0000 UTC" firstStartedPulling="2026-03-14 09:35:54.307669656 +0000 UTC m=+2359.820361914" lastFinishedPulling="2026-03-14 09:35:57.708112208 +0000 UTC m=+2363.220804476" observedRunningTime="2026-03-14 09:35:58.501762334 +0000 UTC m=+2364.014454602" watchObservedRunningTime="2026-03-14 09:35:58.504446412 +0000 UTC 
m=+2364.017138680" Mar 14 09:35:58 crc kubenswrapper[4956]: I0314 09:35:58.520628 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" podStartSLOduration=1.520607933 podStartE2EDuration="1.520607933s" podCreationTimestamp="2026-03-14 09:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:35:58.518466669 +0000 UTC m=+2364.031158937" watchObservedRunningTime="2026-03-14 09:35:58.520607933 +0000 UTC m=+2364.033300221" Mar 14 09:36:00 crc kubenswrapper[4956]: I0314 09:36:00.132270 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558016-2cg4f"] Mar 14 09:36:00 crc kubenswrapper[4956]: I0314 09:36:00.133446 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558016-2cg4f" Mar 14 09:36:00 crc kubenswrapper[4956]: I0314 09:36:00.135599 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:36:00 crc kubenswrapper[4956]: I0314 09:36:00.135943 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:36:00 crc kubenswrapper[4956]: I0314 09:36:00.136447 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:36:00 crc kubenswrapper[4956]: I0314 09:36:00.147467 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558016-2cg4f"] Mar 14 09:36:00 crc kubenswrapper[4956]: I0314 09:36:00.252876 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfqtw\" (UniqueName: \"kubernetes.io/projected/414a44f7-a0b0-4381-b2be-0f118c9c1106-kube-api-access-lfqtw\") pod \"auto-csr-approver-29558016-2cg4f\" 
(UID: \"414a44f7-a0b0-4381-b2be-0f118c9c1106\") " pod="openshift-infra/auto-csr-approver-29558016-2cg4f" Mar 14 09:36:00 crc kubenswrapper[4956]: I0314 09:36:00.355265 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfqtw\" (UniqueName: \"kubernetes.io/projected/414a44f7-a0b0-4381-b2be-0f118c9c1106-kube-api-access-lfqtw\") pod \"auto-csr-approver-29558016-2cg4f\" (UID: \"414a44f7-a0b0-4381-b2be-0f118c9c1106\") " pod="openshift-infra/auto-csr-approver-29558016-2cg4f" Mar 14 09:36:00 crc kubenswrapper[4956]: I0314 09:36:00.375407 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfqtw\" (UniqueName: \"kubernetes.io/projected/414a44f7-a0b0-4381-b2be-0f118c9c1106-kube-api-access-lfqtw\") pod \"auto-csr-approver-29558016-2cg4f\" (UID: \"414a44f7-a0b0-4381-b2be-0f118c9c1106\") " pod="openshift-infra/auto-csr-approver-29558016-2cg4f" Mar 14 09:36:00 crc kubenswrapper[4956]: I0314 09:36:00.471788 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558016-2cg4f" Mar 14 09:36:00 crc kubenswrapper[4956]: I0314 09:36:00.497801 4956 generic.go:334] "Generic (PLEG): container finished" podID="1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c" containerID="531ab46c4edb095fea3a136af494feba3e4ea88c168d6565d7a3e64600d61f72" exitCode=0 Mar 14 09:36:00 crc kubenswrapper[4956]: I0314 09:36:00.498610 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" event={"ID":"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c","Type":"ContainerDied","Data":"531ab46c4edb095fea3a136af494feba3e4ea88c168d6565d7a3e64600d61f72"} Mar 14 09:36:00 crc kubenswrapper[4956]: I0314 09:36:00.913584 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558016-2cg4f"] Mar 14 09:36:01 crc kubenswrapper[4956]: I0314 09:36:01.210022 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:36:01 crc kubenswrapper[4956]: E0314 09:36:01.210231 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:36:01 crc kubenswrapper[4956]: I0314 09:36:01.506126 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558016-2cg4f" event={"ID":"414a44f7-a0b0-4381-b2be-0f118c9c1106","Type":"ContainerStarted","Data":"331e5c484c889753026de2f1a57c4d167b3a7bafa5b6155ac9175d0ed7aa7fc3"} Mar 14 09:36:01 crc kubenswrapper[4956]: I0314 09:36:01.815812 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" Mar 14 09:36:01 crc kubenswrapper[4956]: I0314 09:36:01.979553 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-db-sync-config-data\") pod \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\" (UID: \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\") " Mar 14 09:36:01 crc kubenswrapper[4956]: I0314 09:36:01.980623 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-combined-ca-bundle\") pod \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\" (UID: \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\") " Mar 14 09:36:01 crc kubenswrapper[4956]: I0314 09:36:01.980740 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-config-data\") pod \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\" (UID: \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\") " Mar 14 09:36:01 crc kubenswrapper[4956]: I0314 09:36:01.980963 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntxj8\" (UniqueName: \"kubernetes.io/projected/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-kube-api-access-ntxj8\") pod \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\" (UID: \"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c\") " Mar 14 09:36:02 crc kubenswrapper[4956]: I0314 09:36:02.001659 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c" (UID: "1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:02 crc kubenswrapper[4956]: I0314 09:36:02.001727 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-kube-api-access-ntxj8" (OuterVolumeSpecName: "kube-api-access-ntxj8") pod "1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c" (UID: "1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c"). InnerVolumeSpecName "kube-api-access-ntxj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:02 crc kubenswrapper[4956]: I0314 09:36:02.012129 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c" (UID: "1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:02 crc kubenswrapper[4956]: I0314 09:36:02.034297 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-config-data" (OuterVolumeSpecName: "config-data") pod "1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c" (UID: "1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:02 crc kubenswrapper[4956]: I0314 09:36:02.085463 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntxj8\" (UniqueName: \"kubernetes.io/projected/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-kube-api-access-ntxj8\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:02 crc kubenswrapper[4956]: I0314 09:36:02.085516 4956 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:02 crc kubenswrapper[4956]: I0314 09:36:02.085527 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:02 crc kubenswrapper[4956]: I0314 09:36:02.085538 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:02 crc kubenswrapper[4956]: I0314 09:36:02.515973 4956 generic.go:334] "Generic (PLEG): container finished" podID="414a44f7-a0b0-4381-b2be-0f118c9c1106" containerID="ba8dda9e3ebef2cfdcecc1d852fe29f813806b422aeff407f88a4df3837d64e6" exitCode=0 Mar 14 09:36:02 crc kubenswrapper[4956]: I0314 09:36:02.516322 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558016-2cg4f" event={"ID":"414a44f7-a0b0-4381-b2be-0f118c9c1106","Type":"ContainerDied","Data":"ba8dda9e3ebef2cfdcecc1d852fe29f813806b422aeff407f88a4df3837d64e6"} Mar 14 09:36:02 crc kubenswrapper[4956]: I0314 09:36:02.518871 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" 
event={"ID":"1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c","Type":"ContainerDied","Data":"4a1b88cc259cf5c2bdecc0590de6392ac1c8b9e2291f980733cfd9d008d209c7"} Mar 14 09:36:02 crc kubenswrapper[4956]: I0314 09:36:02.518926 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a1b88cc259cf5c2bdecc0590de6392ac1c8b9e2291f980733cfd9d008d209c7" Mar 14 09:36:02 crc kubenswrapper[4956]: I0314 09:36:02.518971 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.054397 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:36:03 crc kubenswrapper[4956]: E0314 09:36:03.054761 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c" containerName="watcher-kuttl-db-sync" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.054772 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c" containerName="watcher-kuttl-db-sync" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.054908 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c" containerName="watcher-kuttl-db-sync" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.055428 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.058043 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-bpb9f" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.058461 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.067597 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.098967 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.099015 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.099039 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p4g7\" (UniqueName: \"kubernetes.io/projected/f1dc470d-c357-4a37-96ef-bfaf42513d47-kube-api-access-8p4g7\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 
09:36:03.099289 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.099423 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1dc470d-c357-4a37-96ef-bfaf42513d47-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.134144 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.135750 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.137878 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.147589 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.148933 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.150559 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.153260 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.162425 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.201153 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prj59\" (UniqueName: \"kubernetes.io/projected/1d17d668-aef6-4d0f-a650-0c105246b536-kube-api-access-prj59\") pod \"watcher-kuttl-api-0\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.201397 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.201500 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.201589 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f1dc470d-c357-4a37-96ef-bfaf42513d47-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.201702 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.201784 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.201871 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.201946 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p4g7\" (UniqueName: \"kubernetes.io/projected/f1dc470d-c357-4a37-96ef-bfaf42513d47-kube-api-access-8p4g7\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.202004 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1dc470d-c357-4a37-96ef-bfaf42513d47-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.202025 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x47f8\" (UniqueName: \"kubernetes.io/projected/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-kube-api-access-x47f8\") pod \"watcher-kuttl-applier-0\" (UID: \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.202459 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.202512 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.202617 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d17d668-aef6-4d0f-a650-0c105246b536-logs\") pod \"watcher-kuttl-api-0\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.202670 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.202691 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.206522 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.206535 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.206855 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.216559 4956 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8p4g7\" (UniqueName: \"kubernetes.io/projected/f1dc470d-c357-4a37-96ef-bfaf42513d47-kube-api-access-8p4g7\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.303705 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d17d668-aef6-4d0f-a650-0c105246b536-logs\") pod \"watcher-kuttl-api-0\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.303779 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.303832 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prj59\" (UniqueName: \"kubernetes.io/projected/1d17d668-aef6-4d0f-a650-0c105246b536-kube-api-access-prj59\") pod \"watcher-kuttl-api-0\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.303878 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.303918 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.303981 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.304046 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x47f8\" (UniqueName: \"kubernetes.io/projected/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-kube-api-access-x47f8\") pod \"watcher-kuttl-applier-0\" (UID: \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.304116 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.304151 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.304150 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d17d668-aef6-4d0f-a650-0c105246b536-logs\") pod 
\"watcher-kuttl-api-0\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.304446 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.309079 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.309203 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.309229 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.309251 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" 
Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.312303 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.321188 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x47f8\" (UniqueName: \"kubernetes.io/projected/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-kube-api-access-x47f8\") pod \"watcher-kuttl-applier-0\" (UID: \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.322045 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prj59\" (UniqueName: \"kubernetes.io/projected/1d17d668-aef6-4d0f-a650-0c105246b536-kube-api-access-prj59\") pod \"watcher-kuttl-api-0\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.372093 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.454333 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.469350 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.836233 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:36:03 crc kubenswrapper[4956]: W0314 09:36:03.948667 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d17d668_aef6_4d0f_a650_0c105246b536.slice/crio-1ac35755fbbcfd8c7ea2f99dbe079e646bd124d66239132a56d9f10b9a2aa4e8 WatchSource:0}: Error finding container 1ac35755fbbcfd8c7ea2f99dbe079e646bd124d66239132a56d9f10b9a2aa4e8: Status 404 returned error can't find the container with id 1ac35755fbbcfd8c7ea2f99dbe079e646bd124d66239132a56d9f10b9a2aa4e8 Mar 14 09:36:03 crc kubenswrapper[4956]: I0314 09:36:03.961461 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.017342 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558016-2cg4f" Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.100156 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.116085 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfqtw\" (UniqueName: \"kubernetes.io/projected/414a44f7-a0b0-4381-b2be-0f118c9c1106-kube-api-access-lfqtw\") pod \"414a44f7-a0b0-4381-b2be-0f118c9c1106\" (UID: \"414a44f7-a0b0-4381-b2be-0f118c9c1106\") " Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.119462 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/414a44f7-a0b0-4381-b2be-0f118c9c1106-kube-api-access-lfqtw" (OuterVolumeSpecName: "kube-api-access-lfqtw") pod "414a44f7-a0b0-4381-b2be-0f118c9c1106" (UID: "414a44f7-a0b0-4381-b2be-0f118c9c1106"). InnerVolumeSpecName "kube-api-access-lfqtw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.218256 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfqtw\" (UniqueName: \"kubernetes.io/projected/414a44f7-a0b0-4381-b2be-0f118c9c1106-kube-api-access-lfqtw\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.539342 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"d4825c86-0535-4bfb-b0fd-cd5a69771dcf","Type":"ContainerStarted","Data":"286b4a0235e8695f05198043805158949c2865105c2620e30250b1a619177276"} Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.539407 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"d4825c86-0535-4bfb-b0fd-cd5a69771dcf","Type":"ContainerStarted","Data":"cb2f655d984e4ecc8555f355d6ebb464e6615afa1e3e0e67ccd17fb9efa775ef"} Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.541462 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558016-2cg4f" event={"ID":"414a44f7-a0b0-4381-b2be-0f118c9c1106","Type":"ContainerDied","Data":"331e5c484c889753026de2f1a57c4d167b3a7bafa5b6155ac9175d0ed7aa7fc3"} Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.541509 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="331e5c484c889753026de2f1a57c4d167b3a7bafa5b6155ac9175d0ed7aa7fc3" Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.541561 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558016-2cg4f" Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.552470 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f1dc470d-c357-4a37-96ef-bfaf42513d47","Type":"ContainerStarted","Data":"994f37131d5fd390f2de0d17154667f7ff3236921f6d010fff083fbaec0a0f52"} Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.552529 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f1dc470d-c357-4a37-96ef-bfaf42513d47","Type":"ContainerStarted","Data":"8c23dd28b717ca06c46ae4db94f14f56325349f598bf14f371652fac9bb95622"} Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.555181 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1d17d668-aef6-4d0f-a650-0c105246b536","Type":"ContainerStarted","Data":"84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d"} Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.555209 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1d17d668-aef6-4d0f-a650-0c105246b536","Type":"ContainerStarted","Data":"5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e"} Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.555219 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1d17d668-aef6-4d0f-a650-0c105246b536","Type":"ContainerStarted","Data":"1ac35755fbbcfd8c7ea2f99dbe079e646bd124d66239132a56d9f10b9a2aa4e8"} Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.555499 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.557011 4956 prober.go:107] "Probe failed" probeType="Readiness" 
pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="1d17d668-aef6-4d0f-a650-0c105246b536" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.174:9322/\": dial tcp 10.217.0.174:9322: connect: connection refused" Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.571429 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=1.571405649 podStartE2EDuration="1.571405649s" podCreationTimestamp="2026-03-14 09:36:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:36:04.564648757 +0000 UTC m=+2370.077341025" watchObservedRunningTime="2026-03-14 09:36:04.571405649 +0000 UTC m=+2370.084097927" Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.593443 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=1.59342365 podStartE2EDuration="1.59342365s" podCreationTimestamp="2026-03-14 09:36:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:36:04.58479827 +0000 UTC m=+2370.097490538" watchObservedRunningTime="2026-03-14 09:36:04.59342365 +0000 UTC m=+2370.106115918" Mar 14 09:36:04 crc kubenswrapper[4956]: I0314 09:36:04.608248 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=1.608225466 podStartE2EDuration="1.608225466s" podCreationTimestamp="2026-03-14 09:36:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:36:04.605531948 +0000 UTC m=+2370.118224216" watchObservedRunningTime="2026-03-14 09:36:04.608225466 +0000 UTC m=+2370.120917734" Mar 14 09:36:05 crc 
kubenswrapper[4956]: I0314 09:36:05.072112 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558010-9n578"] Mar 14 09:36:05 crc kubenswrapper[4956]: I0314 09:36:05.082451 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558010-9n578"] Mar 14 09:36:05 crc kubenswrapper[4956]: I0314 09:36:05.221949 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5322a936-a1ad-4a11-8264-7f6d088026ac" path="/var/lib/kubelet/pods/5322a936-a1ad-4a11-8264-7f6d088026ac/volumes" Mar 14 09:36:07 crc kubenswrapper[4956]: I0314 09:36:07.779930 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:08 crc kubenswrapper[4956]: I0314 09:36:08.456144 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:08 crc kubenswrapper[4956]: I0314 09:36:08.470161 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:13 crc kubenswrapper[4956]: I0314 09:36:13.373347 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:13 crc kubenswrapper[4956]: I0314 09:36:13.414411 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:13 crc kubenswrapper[4956]: I0314 09:36:13.455980 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:13 crc kubenswrapper[4956]: I0314 09:36:13.470869 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:13 crc kubenswrapper[4956]: I0314 09:36:13.485855 4956 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:13 crc kubenswrapper[4956]: I0314 09:36:13.652237 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:13 crc kubenswrapper[4956]: I0314 09:36:13.656600 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:13 crc kubenswrapper[4956]: I0314 09:36:13.663550 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:13 crc kubenswrapper[4956]: I0314 09:36:13.687887 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:13 crc kubenswrapper[4956]: I0314 09:36:13.691241 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:14 crc kubenswrapper[4956]: I0314 09:36:14.901906 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:36:14 crc kubenswrapper[4956]: I0314 09:36:14.902158 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerName="ceilometer-central-agent" containerID="cri-o://68421e69b28a8cb4db9a5d2676dff5bfa7589188decb5f161692a04bc11623d1" gracePeriod=30 Mar 14 09:36:14 crc kubenswrapper[4956]: I0314 09:36:14.902808 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerName="proxy-httpd" containerID="cri-o://6763e394fe4c0e25d15a5dd2520514161dbe0a66dc64f6ee68fb73bc7d56f420" gracePeriod=30 Mar 14 09:36:14 crc kubenswrapper[4956]: I0314 09:36:14.902827 4956 
kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerName="sg-core" containerID="cri-o://30a29513ca0cab96cf0f62fa5e5fe5620a39260187952f5c06e858fdcd2b23a3" gracePeriod=30 Mar 14 09:36:14 crc kubenswrapper[4956]: I0314 09:36:14.902897 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerName="ceilometer-notification-agent" containerID="cri-o://3f895236543aa337b10720419d402edb5803dc4b8194157744f1817b438202fb" gracePeriod=30 Mar 14 09:36:14 crc kubenswrapper[4956]: I0314 09:36:14.912888 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.170:3000/\": EOF" Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.216214 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:36:15 crc kubenswrapper[4956]: E0314 09:36:15.216864 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.240210 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2"] Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.253648 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-fhbq2"] Mar 14 09:36:15 
crc kubenswrapper[4956]: I0314 09:36:15.260007 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher5cb7-account-delete-kdql5"] Mar 14 09:36:15 crc kubenswrapper[4956]: E0314 09:36:15.260379 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="414a44f7-a0b0-4381-b2be-0f118c9c1106" containerName="oc" Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.260401 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="414a44f7-a0b0-4381-b2be-0f118c9c1106" containerName="oc" Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.260584 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="414a44f7-a0b0-4381-b2be-0f118c9c1106" containerName="oc" Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.261224 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher5cb7-account-delete-kdql5" Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.270652 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher5cb7-account-delete-kdql5"] Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.369314 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.389861 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.406060 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r587\" (UniqueName: \"kubernetes.io/projected/4007546c-563d-466f-a611-5b440804d94d-kube-api-access-4r587\") pod \"watcher5cb7-account-delete-kdql5\" (UID: \"4007546c-563d-466f-a611-5b440804d94d\") " pod="watcher-kuttl-default/watcher5cb7-account-delete-kdql5" Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.406126 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4007546c-563d-466f-a611-5b440804d94d-operator-scripts\") pod \"watcher5cb7-account-delete-kdql5\" (UID: \"4007546c-563d-466f-a611-5b440804d94d\") " pod="watcher-kuttl-default/watcher5cb7-account-delete-kdql5" Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.407299 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.508151 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r587\" (UniqueName: \"kubernetes.io/projected/4007546c-563d-466f-a611-5b440804d94d-kube-api-access-4r587\") pod \"watcher5cb7-account-delete-kdql5\" (UID: \"4007546c-563d-466f-a611-5b440804d94d\") " pod="watcher-kuttl-default/watcher5cb7-account-delete-kdql5" Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.508207 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4007546c-563d-466f-a611-5b440804d94d-operator-scripts\") pod \"watcher5cb7-account-delete-kdql5\" (UID: \"4007546c-563d-466f-a611-5b440804d94d\") " pod="watcher-kuttl-default/watcher5cb7-account-delete-kdql5" Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.508997 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4007546c-563d-466f-a611-5b440804d94d-operator-scripts\") pod \"watcher5cb7-account-delete-kdql5\" (UID: \"4007546c-563d-466f-a611-5b440804d94d\") " pod="watcher-kuttl-default/watcher5cb7-account-delete-kdql5" Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.529622 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r587\" (UniqueName: 
\"kubernetes.io/projected/4007546c-563d-466f-a611-5b440804d94d-kube-api-access-4r587\") pod \"watcher5cb7-account-delete-kdql5\" (UID: \"4007546c-563d-466f-a611-5b440804d94d\") " pod="watcher-kuttl-default/watcher5cb7-account-delete-kdql5" Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.576718 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher5cb7-account-delete-kdql5" Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.677791 4956 generic.go:334] "Generic (PLEG): container finished" podID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerID="6763e394fe4c0e25d15a5dd2520514161dbe0a66dc64f6ee68fb73bc7d56f420" exitCode=0 Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.677818 4956 generic.go:334] "Generic (PLEG): container finished" podID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerID="30a29513ca0cab96cf0f62fa5e5fe5620a39260187952f5c06e858fdcd2b23a3" exitCode=2 Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.677826 4956 generic.go:334] "Generic (PLEG): container finished" podID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerID="68421e69b28a8cb4db9a5d2676dff5bfa7589188decb5f161692a04bc11623d1" exitCode=0 Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.677994 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="d4825c86-0535-4bfb-b0fd-cd5a69771dcf" containerName="watcher-applier" containerID="cri-o://286b4a0235e8695f05198043805158949c2865105c2620e30250b1a619177276" gracePeriod=30 Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.678281 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"05536f1a-0f62-4ddb-93a0-85429c9676ee","Type":"ContainerDied","Data":"6763e394fe4c0e25d15a5dd2520514161dbe0a66dc64f6ee68fb73bc7d56f420"} Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.678307 4956 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"05536f1a-0f62-4ddb-93a0-85429c9676ee","Type":"ContainerDied","Data":"30a29513ca0cab96cf0f62fa5e5fe5620a39260187952f5c06e858fdcd2b23a3"} Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.678319 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"05536f1a-0f62-4ddb-93a0-85429c9676ee","Type":"ContainerDied","Data":"68421e69b28a8cb4db9a5d2676dff5bfa7589188decb5f161692a04bc11623d1"} Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.678721 4956 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" secret="" err="secret \"watcher-watcher-kuttl-dockercfg-bpb9f\" not found" Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.679051 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="1d17d668-aef6-4d0f-a650-0c105246b536" containerName="watcher-kuttl-api-log" containerID="cri-o://5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e" gracePeriod=30 Mar 14 09:36:15 crc kubenswrapper[4956]: I0314 09:36:15.679120 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="1d17d668-aef6-4d0f-a650-0c105246b536" containerName="watcher-api" containerID="cri-o://84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d" gracePeriod=30 Mar 14 09:36:15 crc kubenswrapper[4956]: E0314 09:36:15.714091 4956 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:36:15 crc kubenswrapper[4956]: E0314 09:36:15.714180 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-config-data podName:f1dc470d-c357-4a37-96ef-bfaf42513d47 
nodeName:}" failed. No retries permitted until 2026-03-14 09:36:16.214156499 +0000 UTC m=+2381.726848767 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "f1dc470d-c357-4a37-96ef-bfaf42513d47") : secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:36:16 crc kubenswrapper[4956]: W0314 09:36:16.071836 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4007546c_563d_466f_a611_5b440804d94d.slice/crio-a40def40ec17c383bb9c9363db4c757c8323c5d0aa6706bc5bbeb89819f2dc83 WatchSource:0}: Error finding container a40def40ec17c383bb9c9363db4c757c8323c5d0aa6706bc5bbeb89819f2dc83: Status 404 returned error can't find the container with id a40def40ec17c383bb9c9363db4c757c8323c5d0aa6706bc5bbeb89819f2dc83 Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.080435 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher5cb7-account-delete-kdql5"] Mar 14 09:36:16 crc kubenswrapper[4956]: E0314 09:36:16.221638 4956 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:36:16 crc kubenswrapper[4956]: E0314 09:36:16.222925 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-config-data podName:f1dc470d-c357-4a37-96ef-bfaf42513d47 nodeName:}" failed. No retries permitted until 2026-03-14 09:36:17.222894945 +0000 UTC m=+2382.735587213 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "f1dc470d-c357-4a37-96ef-bfaf42513d47") : secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.634105 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.694108 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher5cb7-account-delete-kdql5" event={"ID":"4007546c-563d-466f-a611-5b440804d94d","Type":"ContainerStarted","Data":"3fdf7b4e03c7e389dae3f007121417b911ea4f71fdad56e9fe34e31bbfd30e02"} Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.694156 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher5cb7-account-delete-kdql5" event={"ID":"4007546c-563d-466f-a611-5b440804d94d","Type":"ContainerStarted","Data":"a40def40ec17c383bb9c9363db4c757c8323c5d0aa6706bc5bbeb89819f2dc83"} Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.703747 4956 generic.go:334] "Generic (PLEG): container finished" podID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerID="3f895236543aa337b10720419d402edb5803dc4b8194157744f1817b438202fb" exitCode=0 Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.703825 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"05536f1a-0f62-4ddb-93a0-85429c9676ee","Type":"ContainerDied","Data":"3f895236543aa337b10720419d402edb5803dc4b8194157744f1817b438202fb"} Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.706789 4956 generic.go:334] "Generic (PLEG): container finished" podID="1d17d668-aef6-4d0f-a650-0c105246b536" containerID="84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d" exitCode=0 Mar 14 09:36:16 crc 
kubenswrapper[4956]: I0314 09:36:16.706811 4956 generic.go:334] "Generic (PLEG): container finished" podID="1d17d668-aef6-4d0f-a650-0c105246b536" containerID="5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e" exitCode=143 Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.706954 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f1dc470d-c357-4a37-96ef-bfaf42513d47" containerName="watcher-decision-engine" containerID="cri-o://994f37131d5fd390f2de0d17154667f7ff3236921f6d010fff083fbaec0a0f52" gracePeriod=30 Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.707199 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.707519 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1d17d668-aef6-4d0f-a650-0c105246b536","Type":"ContainerDied","Data":"84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d"} Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.707541 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1d17d668-aef6-4d0f-a650-0c105246b536","Type":"ContainerDied","Data":"5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e"} Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.707550 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1d17d668-aef6-4d0f-a650-0c105246b536","Type":"ContainerDied","Data":"1ac35755fbbcfd8c7ea2f99dbe079e646bd124d66239132a56d9f10b9a2aa4e8"} Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.707573 4956 scope.go:117] "RemoveContainer" containerID="84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 
09:36:16.708424 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher5cb7-account-delete-kdql5" podStartSLOduration=1.70841569 podStartE2EDuration="1.70841569s" podCreationTimestamp="2026-03-14 09:36:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:36:16.706708997 +0000 UTC m=+2382.219401265" watchObservedRunningTime="2026-03-14 09:36:16.70841569 +0000 UTC m=+2382.221107958" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.737341 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.748066 4956 scope.go:117] "RemoveContainer" containerID="5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.809576 4956 scope.go:117] "RemoveContainer" containerID="84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d" Mar 14 09:36:16 crc kubenswrapper[4956]: E0314 09:36:16.810409 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d\": container with ID starting with 84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d not found: ID does not exist" containerID="84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.810502 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d"} err="failed to get container status \"84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d\": rpc error: code = NotFound desc = could not find container 
\"84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d\": container with ID starting with 84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d not found: ID does not exist" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.810539 4956 scope.go:117] "RemoveContainer" containerID="5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e" Mar 14 09:36:16 crc kubenswrapper[4956]: E0314 09:36:16.811083 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e\": container with ID starting with 5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e not found: ID does not exist" containerID="5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.811131 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e"} err="failed to get container status \"5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e\": rpc error: code = NotFound desc = could not find container \"5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e\": container with ID starting with 5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e not found: ID does not exist" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.811166 4956 scope.go:117] "RemoveContainer" containerID="84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.811879 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d"} err="failed to get container status \"84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d\": rpc error: code = NotFound desc = could not find 
container \"84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d\": container with ID starting with 84c3ac6d4d5fcec4bec0bb81bf974069ad5bc97efb0e99438ab5cd28961b431d not found: ID does not exist" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.811920 4956 scope.go:117] "RemoveContainer" containerID="5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.812206 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e"} err="failed to get container status \"5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e\": rpc error: code = NotFound desc = could not find container \"5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e\": container with ID starting with 5e2c9298720a2dccc8f063c8536d2033b3ea9f2f80fc3727b4870d4c889ebc9e not found: ID does not exist" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.835621 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05536f1a-0f62-4ddb-93a0-85429c9676ee-run-httpd\") pod \"05536f1a-0f62-4ddb-93a0-85429c9676ee\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.835691 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d17d668-aef6-4d0f-a650-0c105246b536-logs\") pod \"1d17d668-aef6-4d0f-a650-0c105246b536\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.835730 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cr2c\" (UniqueName: \"kubernetes.io/projected/05536f1a-0f62-4ddb-93a0-85429c9676ee-kube-api-access-7cr2c\") pod \"05536f1a-0f62-4ddb-93a0-85429c9676ee\" (UID: 
\"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.835752 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-scripts\") pod \"05536f1a-0f62-4ddb-93a0-85429c9676ee\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.836120 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05536f1a-0f62-4ddb-93a0-85429c9676ee-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "05536f1a-0f62-4ddb-93a0-85429c9676ee" (UID: "05536f1a-0f62-4ddb-93a0-85429c9676ee"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.836301 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d17d668-aef6-4d0f-a650-0c105246b536-logs" (OuterVolumeSpecName: "logs") pod "1d17d668-aef6-4d0f-a650-0c105246b536" (UID: "1d17d668-aef6-4d0f-a650-0c105246b536"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.836847 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-sg-core-conf-yaml\") pod \"05536f1a-0f62-4ddb-93a0-85429c9676ee\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.836881 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-combined-ca-bundle\") pod \"1d17d668-aef6-4d0f-a650-0c105246b536\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.836914 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prj59\" (UniqueName: \"kubernetes.io/projected/1d17d668-aef6-4d0f-a650-0c105246b536-kube-api-access-prj59\") pod \"1d17d668-aef6-4d0f-a650-0c105246b536\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.836940 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-ceilometer-tls-certs\") pod \"05536f1a-0f62-4ddb-93a0-85429c9676ee\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.836980 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05536f1a-0f62-4ddb-93a0-85429c9676ee-log-httpd\") pod \"05536f1a-0f62-4ddb-93a0-85429c9676ee\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.837033 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-custom-prometheus-ca\") pod \"1d17d668-aef6-4d0f-a650-0c105246b536\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.837059 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-config-data\") pod \"05536f1a-0f62-4ddb-93a0-85429c9676ee\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.837114 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-combined-ca-bundle\") pod \"05536f1a-0f62-4ddb-93a0-85429c9676ee\" (UID: \"05536f1a-0f62-4ddb-93a0-85429c9676ee\") " Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.837147 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-config-data\") pod \"1d17d668-aef6-4d0f-a650-0c105246b536\" (UID: \"1d17d668-aef6-4d0f-a650-0c105246b536\") " Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.837513 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05536f1a-0f62-4ddb-93a0-85429c9676ee-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.837571 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d17d668-aef6-4d0f-a650-0c105246b536-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.838265 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05536f1a-0f62-4ddb-93a0-85429c9676ee-log-httpd" 
(OuterVolumeSpecName: "log-httpd") pod "05536f1a-0f62-4ddb-93a0-85429c9676ee" (UID: "05536f1a-0f62-4ddb-93a0-85429c9676ee"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.844544 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05536f1a-0f62-4ddb-93a0-85429c9676ee-kube-api-access-7cr2c" (OuterVolumeSpecName: "kube-api-access-7cr2c") pod "05536f1a-0f62-4ddb-93a0-85429c9676ee" (UID: "05536f1a-0f62-4ddb-93a0-85429c9676ee"). InnerVolumeSpecName "kube-api-access-7cr2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.846300 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-scripts" (OuterVolumeSpecName: "scripts") pod "05536f1a-0f62-4ddb-93a0-85429c9676ee" (UID: "05536f1a-0f62-4ddb-93a0-85429c9676ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.855659 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d17d668-aef6-4d0f-a650-0c105246b536-kube-api-access-prj59" (OuterVolumeSpecName: "kube-api-access-prj59") pod "1d17d668-aef6-4d0f-a650-0c105246b536" (UID: "1d17d668-aef6-4d0f-a650-0c105246b536"). InnerVolumeSpecName "kube-api-access-prj59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.872452 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d17d668-aef6-4d0f-a650-0c105246b536" (UID: "1d17d668-aef6-4d0f-a650-0c105246b536"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.916630 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "1d17d668-aef6-4d0f-a650-0c105246b536" (UID: "1d17d668-aef6-4d0f-a650-0c105246b536"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.922560 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "05536f1a-0f62-4ddb-93a0-85429c9676ee" (UID: "05536f1a-0f62-4ddb-93a0-85429c9676ee"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.937873 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "05536f1a-0f62-4ddb-93a0-85429c9676ee" (UID: "05536f1a-0f62-4ddb-93a0-85429c9676ee"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.939057 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05536f1a-0f62-4ddb-93a0-85429c9676ee-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.939082 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.939096 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cr2c\" (UniqueName: \"kubernetes.io/projected/05536f1a-0f62-4ddb-93a0-85429c9676ee-kube-api-access-7cr2c\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.939104 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.939114 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.939126 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.939134 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prj59\" (UniqueName: \"kubernetes.io/projected/1d17d668-aef6-4d0f-a650-0c105246b536-kube-api-access-prj59\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 
09:36:16.939142 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.955028 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-config-data" (OuterVolumeSpecName: "config-data") pod "05536f1a-0f62-4ddb-93a0-85429c9676ee" (UID: "05536f1a-0f62-4ddb-93a0-85429c9676ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.956569 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05536f1a-0f62-4ddb-93a0-85429c9676ee" (UID: "05536f1a-0f62-4ddb-93a0-85429c9676ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:16 crc kubenswrapper[4956]: I0314 09:36:16.961216 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-config-data" (OuterVolumeSpecName: "config-data") pod "1d17d668-aef6-4d0f-a650-0c105246b536" (UID: "1d17d668-aef6-4d0f-a650-0c105246b536"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.039964 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.039993 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05536f1a-0f62-4ddb-93a0-85429c9676ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.040004 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d17d668-aef6-4d0f-a650-0c105246b536-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.098072 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.111064 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.220301 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d17d668-aef6-4d0f-a650-0c105246b536" path="/var/lib/kubelet/pods/1d17d668-aef6-4d0f-a650-0c105246b536/volumes" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.220984 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c" path="/var/lib/kubelet/pods/1f8e3158-ecb5-43f2-bcd8-34d9810e7a7c/volumes" Mar 14 09:36:17 crc kubenswrapper[4956]: E0314 09:36:17.242781 4956 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:36:17 crc kubenswrapper[4956]: E0314 09:36:17.242859 4956 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-config-data podName:f1dc470d-c357-4a37-96ef-bfaf42513d47 nodeName:}" failed. No retries permitted until 2026-03-14 09:36:19.24284185 +0000 UTC m=+2384.755534118 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "f1dc470d-c357-4a37-96ef-bfaf42513d47") : secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.722408 4956 generic.go:334] "Generic (PLEG): container finished" podID="4007546c-563d-466f-a611-5b440804d94d" containerID="3fdf7b4e03c7e389dae3f007121417b911ea4f71fdad56e9fe34e31bbfd30e02" exitCode=0 Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.722530 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher5cb7-account-delete-kdql5" event={"ID":"4007546c-563d-466f-a611-5b440804d94d","Type":"ContainerDied","Data":"3fdf7b4e03c7e389dae3f007121417b911ea4f71fdad56e9fe34e31bbfd30e02"} Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.729113 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"05536f1a-0f62-4ddb-93a0-85429c9676ee","Type":"ContainerDied","Data":"0a78c77183eec02d9837d2ca48e647b0e31713cbb604d45912cdcb7a98da4e86"} Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.729179 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.729273 4956 scope.go:117] "RemoveContainer" containerID="6763e394fe4c0e25d15a5dd2520514161dbe0a66dc64f6ee68fb73bc7d56f420" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.766796 4956 scope.go:117] "RemoveContainer" containerID="30a29513ca0cab96cf0f62fa5e5fe5620a39260187952f5c06e858fdcd2b23a3" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.776678 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.784443 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.798543 4956 scope.go:117] "RemoveContainer" containerID="3f895236543aa337b10720419d402edb5803dc4b8194157744f1817b438202fb" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.830610 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:36:17 crc kubenswrapper[4956]: E0314 09:36:17.830998 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerName="ceilometer-notification-agent" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.831015 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerName="ceilometer-notification-agent" Mar 14 09:36:17 crc kubenswrapper[4956]: E0314 09:36:17.831028 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerName="sg-core" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.831034 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerName="sg-core" Mar 14 09:36:17 crc kubenswrapper[4956]: E0314 09:36:17.831046 4956 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="1d17d668-aef6-4d0f-a650-0c105246b536" containerName="watcher-api" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.831052 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d17d668-aef6-4d0f-a650-0c105246b536" containerName="watcher-api" Mar 14 09:36:17 crc kubenswrapper[4956]: E0314 09:36:17.831060 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerName="ceilometer-central-agent" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.831066 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerName="ceilometer-central-agent" Mar 14 09:36:17 crc kubenswrapper[4956]: E0314 09:36:17.831081 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d17d668-aef6-4d0f-a650-0c105246b536" containerName="watcher-kuttl-api-log" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.831087 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d17d668-aef6-4d0f-a650-0c105246b536" containerName="watcher-kuttl-api-log" Mar 14 09:36:17 crc kubenswrapper[4956]: E0314 09:36:17.831103 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerName="proxy-httpd" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.831109 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerName="proxy-httpd" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.831291 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d17d668-aef6-4d0f-a650-0c105246b536" containerName="watcher-api" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.831306 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerName="sg-core" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.831316 4956 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerName="ceilometer-notification-agent" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.831324 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerName="ceilometer-central-agent" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.831335 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" containerName="proxy-httpd" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.831341 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d17d668-aef6-4d0f-a650-0c105246b536" containerName="watcher-kuttl-api-log" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.832721 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.836256 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.837673 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.837861 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.837980 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.841382 4956 scope.go:117] "RemoveContainer" containerID="68421e69b28a8cb4db9a5d2676dff5bfa7589188decb5f161692a04bc11623d1" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.935936 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:36:17 crc kubenswrapper[4956]: E0314 
09:36:17.937085 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-2s889 log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="watcher-kuttl-default/ceilometer-0" podUID="52593735-5960-49ec-b45b-ab63715c1ed9" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.953169 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52593735-5960-49ec-b45b-ab63715c1ed9-log-httpd\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.953241 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.953495 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s889\" (UniqueName: \"kubernetes.io/projected/52593735-5960-49ec-b45b-ab63715c1ed9-kube-api-access-2s889\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.953681 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52593735-5960-49ec-b45b-ab63715c1ed9-run-httpd\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.953742 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.953769 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-config-data\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.953801 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:17 crc kubenswrapper[4956]: I0314 09:36:17.953849 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-scripts\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.055825 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52593735-5960-49ec-b45b-ab63715c1ed9-run-httpd\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.055899 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.055926 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-config-data\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.055949 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.055982 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-scripts\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.056079 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52593735-5960-49ec-b45b-ab63715c1ed9-log-httpd\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.056101 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " 
pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.056133 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s889\" (UniqueName: \"kubernetes.io/projected/52593735-5960-49ec-b45b-ab63715c1ed9-kube-api-access-2s889\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.056421 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52593735-5960-49ec-b45b-ab63715c1ed9-run-httpd\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.057789 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52593735-5960-49ec-b45b-ab63715c1ed9-log-httpd\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.061347 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.062118 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-scripts\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.063751 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.064254 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.065026 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-config-data\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.075440 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s889\" (UniqueName: \"kubernetes.io/projected/52593735-5960-49ec-b45b-ab63715c1ed9-kube-api-access-2s889\") pod \"ceilometer-0\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: E0314 09:36:18.472913 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="286b4a0235e8695f05198043805158949c2865105c2620e30250b1a619177276" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 09:36:18 crc kubenswrapper[4956]: E0314 09:36:18.474617 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="286b4a0235e8695f05198043805158949c2865105c2620e30250b1a619177276" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 09:36:18 crc kubenswrapper[4956]: E0314 09:36:18.475919 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="286b4a0235e8695f05198043805158949c2865105c2620e30250b1a619177276" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 09:36:18 crc kubenswrapper[4956]: E0314 09:36:18.475970 4956 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="d4825c86-0535-4bfb-b0fd-cd5a69771dcf" containerName="watcher-applier" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.743594 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.755107 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.771293 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-scripts\") pod \"52593735-5960-49ec-b45b-ab63715c1ed9\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.771350 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-ceilometer-tls-certs\") pod \"52593735-5960-49ec-b45b-ab63715c1ed9\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.771374 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52593735-5960-49ec-b45b-ab63715c1ed9-log-httpd\") pod \"52593735-5960-49ec-b45b-ab63715c1ed9\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.771403 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s889\" (UniqueName: \"kubernetes.io/projected/52593735-5960-49ec-b45b-ab63715c1ed9-kube-api-access-2s889\") pod \"52593735-5960-49ec-b45b-ab63715c1ed9\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.771833 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52593735-5960-49ec-b45b-ab63715c1ed9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "52593735-5960-49ec-b45b-ab63715c1ed9" (UID: "52593735-5960-49ec-b45b-ab63715c1ed9"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.776576 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "52593735-5960-49ec-b45b-ab63715c1ed9" (UID: "52593735-5960-49ec-b45b-ab63715c1ed9"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.776857 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-scripts" (OuterVolumeSpecName: "scripts") pod "52593735-5960-49ec-b45b-ab63715c1ed9" (UID: "52593735-5960-49ec-b45b-ab63715c1ed9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:18 crc kubenswrapper[4956]: I0314 09:36:18.777365 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52593735-5960-49ec-b45b-ab63715c1ed9-kube-api-access-2s889" (OuterVolumeSpecName: "kube-api-access-2s889") pod "52593735-5960-49ec-b45b-ab63715c1ed9" (UID: "52593735-5960-49ec-b45b-ab63715c1ed9"). InnerVolumeSpecName "kube-api-access-2s889". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:18.873031 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-combined-ca-bundle\") pod \"52593735-5960-49ec-b45b-ab63715c1ed9\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:18.873410 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-sg-core-conf-yaml\") pod \"52593735-5960-49ec-b45b-ab63715c1ed9\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:18.873468 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52593735-5960-49ec-b45b-ab63715c1ed9-run-httpd\") pod \"52593735-5960-49ec-b45b-ab63715c1ed9\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:18.873570 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-config-data\") pod \"52593735-5960-49ec-b45b-ab63715c1ed9\" (UID: \"52593735-5960-49ec-b45b-ab63715c1ed9\") " Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:18.873988 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52593735-5960-49ec-b45b-ab63715c1ed9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "52593735-5960-49ec-b45b-ab63715c1ed9" (UID: "52593735-5960-49ec-b45b-ab63715c1ed9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:18.874458 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:18.874503 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:18.874524 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52593735-5960-49ec-b45b-ab63715c1ed9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:18.874539 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s889\" (UniqueName: \"kubernetes.io/projected/52593735-5960-49ec-b45b-ab63715c1ed9-kube-api-access-2s889\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:18.874550 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52593735-5960-49ec-b45b-ab63715c1ed9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:18.878251 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-config-data" (OuterVolumeSpecName: "config-data") pod "52593735-5960-49ec-b45b-ab63715c1ed9" (UID: "52593735-5960-49ec-b45b-ab63715c1ed9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:18.879289 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52593735-5960-49ec-b45b-ab63715c1ed9" (UID: "52593735-5960-49ec-b45b-ab63715c1ed9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:18.889666 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "52593735-5960-49ec-b45b-ab63715c1ed9" (UID: "52593735-5960-49ec-b45b-ab63715c1ed9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:18.976170 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:18.976208 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:18.976219 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52593735-5960-49ec-b45b-ab63715c1ed9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.036696 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher5cb7-account-delete-kdql5" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.178759 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4007546c-563d-466f-a611-5b440804d94d-operator-scripts\") pod \"4007546c-563d-466f-a611-5b440804d94d\" (UID: \"4007546c-563d-466f-a611-5b440804d94d\") " Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.179012 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r587\" (UniqueName: \"kubernetes.io/projected/4007546c-563d-466f-a611-5b440804d94d-kube-api-access-4r587\") pod \"4007546c-563d-466f-a611-5b440804d94d\" (UID: \"4007546c-563d-466f-a611-5b440804d94d\") " Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.179651 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4007546c-563d-466f-a611-5b440804d94d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4007546c-563d-466f-a611-5b440804d94d" (UID: "4007546c-563d-466f-a611-5b440804d94d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.182052 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4007546c-563d-466f-a611-5b440804d94d-kube-api-access-4r587" (OuterVolumeSpecName: "kube-api-access-4r587") pod "4007546c-563d-466f-a611-5b440804d94d" (UID: "4007546c-563d-466f-a611-5b440804d94d"). InnerVolumeSpecName "kube-api-access-4r587". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.223907 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05536f1a-0f62-4ddb-93a0-85429c9676ee" path="/var/lib/kubelet/pods/05536f1a-0f62-4ddb-93a0-85429c9676ee/volumes" Mar 14 09:36:19 crc kubenswrapper[4956]: E0314 09:36:19.281094 4956 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:36:19 crc kubenswrapper[4956]: E0314 09:36:19.281318 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-config-data podName:f1dc470d-c357-4a37-96ef-bfaf42513d47 nodeName:}" failed. No retries permitted until 2026-03-14 09:36:23.281230031 +0000 UTC m=+2388.793922299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "f1dc470d-c357-4a37-96ef-bfaf42513d47") : secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.281811 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r587\" (UniqueName: \"kubernetes.io/projected/4007546c-563d-466f-a611-5b440804d94d-kube-api-access-4r587\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.281830 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4007546c-563d-466f-a611-5b440804d94d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.755451 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.755444 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher5cb7-account-delete-kdql5" event={"ID":"4007546c-563d-466f-a611-5b440804d94d","Type":"ContainerDied","Data":"a40def40ec17c383bb9c9363db4c757c8323c5d0aa6706bc5bbeb89819f2dc83"} Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.756387 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a40def40ec17c383bb9c9363db4c757c8323c5d0aa6706bc5bbeb89819f2dc83" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.755538 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher5cb7-account-delete-kdql5" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.814509 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.833802 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.842555 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:36:19 crc kubenswrapper[4956]: E0314 09:36:19.842983 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4007546c-563d-466f-a611-5b440804d94d" containerName="mariadb-account-delete" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.842999 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4007546c-563d-466f-a611-5b440804d94d" containerName="mariadb-account-delete" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.843189 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="4007546c-563d-466f-a611-5b440804d94d" containerName="mariadb-account-delete" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.844953 4956 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.848604 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.849100 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.849372 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.852048 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.992430 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.992806 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.992900 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-config-data\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:19 crc 
kubenswrapper[4956]: I0314 09:36:19.992975 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-run-httpd\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.993264 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-log-httpd\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.993297 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.993325 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-scripts\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:19 crc kubenswrapper[4956]: I0314 09:36:19.993511 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cpb2\" (UniqueName: \"kubernetes.io/projected/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-kube-api-access-7cpb2\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.094622 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.094681 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-config-data\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.094706 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-run-httpd\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.094755 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-log-httpd\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.094779 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.094800 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-scripts\") pod \"ceilometer-0\" (UID: 
\"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.094843 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cpb2\" (UniqueName: \"kubernetes.io/projected/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-kube-api-access-7cpb2\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.094877 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.096027 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-run-httpd\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.096032 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-log-httpd\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.099185 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.099798 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-config-data\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.100454 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.102201 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.105178 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-scripts\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.124627 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cpb2\" (UniqueName: \"kubernetes.io/projected/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-kube-api-access-7cpb2\") pod \"ceilometer-0\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.164026 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.315612 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-fv2n7"] Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.323613 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-fv2n7"] Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.330316 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher5cb7-account-delete-kdql5"] Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.338317 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher5cb7-account-delete-kdql5"] Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.352722 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-5cb7-account-create-update-972hv"] Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.365086 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-5cb7-account-create-update-972hv"] Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.539567 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:36:20 crc kubenswrapper[4956]: W0314 09:36:20.549885 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e1bfd92_a9b1_4fe9_8dc3_ef4736c59942.slice/crio-573c7ba72179109b8b55513f683a8d1062cd6f53d8ca7a5865be18ec3763dedc WatchSource:0}: Error finding container 573c7ba72179109b8b55513f683a8d1062cd6f53d8ca7a5865be18ec3763dedc: Status 404 returned error can't find the container with id 573c7ba72179109b8b55513f683a8d1062cd6f53d8ca7a5865be18ec3763dedc Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.747206 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.764352 4956 generic.go:334] "Generic (PLEG): container finished" podID="d4825c86-0535-4bfb-b0fd-cd5a69771dcf" containerID="286b4a0235e8695f05198043805158949c2865105c2620e30250b1a619177276" exitCode=0 Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.764426 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"d4825c86-0535-4bfb-b0fd-cd5a69771dcf","Type":"ContainerDied","Data":"286b4a0235e8695f05198043805158949c2865105c2620e30250b1a619177276"} Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.764462 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"d4825c86-0535-4bfb-b0fd-cd5a69771dcf","Type":"ContainerDied","Data":"cb2f655d984e4ecc8555f355d6ebb464e6615afa1e3e0e67ccd17fb9efa775ef"} Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.764498 4956 scope.go:117] "RemoveContainer" containerID="286b4a0235e8695f05198043805158949c2865105c2620e30250b1a619177276" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.764627 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.766906 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942","Type":"ContainerStarted","Data":"573c7ba72179109b8b55513f683a8d1062cd6f53d8ca7a5865be18ec3763dedc"} Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.810636 4956 scope.go:117] "RemoveContainer" containerID="286b4a0235e8695f05198043805158949c2865105c2620e30250b1a619177276" Mar 14 09:36:20 crc kubenswrapper[4956]: E0314 09:36:20.814613 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"286b4a0235e8695f05198043805158949c2865105c2620e30250b1a619177276\": container with ID starting with 286b4a0235e8695f05198043805158949c2865105c2620e30250b1a619177276 not found: ID does not exist" containerID="286b4a0235e8695f05198043805158949c2865105c2620e30250b1a619177276" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.814659 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"286b4a0235e8695f05198043805158949c2865105c2620e30250b1a619177276"} err="failed to get container status \"286b4a0235e8695f05198043805158949c2865105c2620e30250b1a619177276\": rpc error: code = NotFound desc = could not find container \"286b4a0235e8695f05198043805158949c2865105c2620e30250b1a619177276\": container with ID starting with 286b4a0235e8695f05198043805158949c2865105c2620e30250b1a619177276 not found: ID does not exist" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.908176 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-logs\") pod \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\" (UID: \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\") " Mar 14 09:36:20 crc kubenswrapper[4956]: 
I0314 09:36:20.908318 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-combined-ca-bundle\") pod \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\" (UID: \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\") " Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.908353 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-config-data\") pod \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\" (UID: \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\") " Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.908449 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x47f8\" (UniqueName: \"kubernetes.io/projected/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-kube-api-access-x47f8\") pod \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\" (UID: \"d4825c86-0535-4bfb-b0fd-cd5a69771dcf\") " Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.908707 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-logs" (OuterVolumeSpecName: "logs") pod "d4825c86-0535-4bfb-b0fd-cd5a69771dcf" (UID: "d4825c86-0535-4bfb-b0fd-cd5a69771dcf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.908792 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.914041 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-kube-api-access-x47f8" (OuterVolumeSpecName: "kube-api-access-x47f8") pod "d4825c86-0535-4bfb-b0fd-cd5a69771dcf" (UID: "d4825c86-0535-4bfb-b0fd-cd5a69771dcf"). InnerVolumeSpecName "kube-api-access-x47f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.930920 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4825c86-0535-4bfb-b0fd-cd5a69771dcf" (UID: "d4825c86-0535-4bfb-b0fd-cd5a69771dcf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:20 crc kubenswrapper[4956]: I0314 09:36:20.951974 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-config-data" (OuterVolumeSpecName: "config-data") pod "d4825c86-0535-4bfb-b0fd-cd5a69771dcf" (UID: "d4825c86-0535-4bfb-b0fd-cd5a69771dcf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.010461 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x47f8\" (UniqueName: \"kubernetes.io/projected/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-kube-api-access-x47f8\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.010516 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.010530 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4825c86-0535-4bfb-b0fd-cd5a69771dcf-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.104777 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.114405 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.237637 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2" path="/var/lib/kubelet/pods/26d3b3e8-bb6b-42b9-8d28-29a1e5e8d3c2/volumes" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.242311 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4007546c-563d-466f-a611-5b440804d94d" path="/var/lib/kubelet/pods/4007546c-563d-466f-a611-5b440804d94d/volumes" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.243308 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52593735-5960-49ec-b45b-ab63715c1ed9" path="/var/lib/kubelet/pods/52593735-5960-49ec-b45b-ab63715c1ed9/volumes" Mar 14 09:36:21 crc kubenswrapper[4956]: 
I0314 09:36:21.246030 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a703874a-c31f-4e92-8b4a-2f3e28d31666" path="/var/lib/kubelet/pods/a703874a-c31f-4e92-8b4a-2f3e28d31666/volumes" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.251310 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4825c86-0535-4bfb-b0fd-cd5a69771dcf" path="/var/lib/kubelet/pods/d4825c86-0535-4bfb-b0fd-cd5a69771dcf/volumes" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.592258 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.628664 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-combined-ca-bundle\") pod \"f1dc470d-c357-4a37-96ef-bfaf42513d47\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.628731 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p4g7\" (UniqueName: \"kubernetes.io/projected/f1dc470d-c357-4a37-96ef-bfaf42513d47-kube-api-access-8p4g7\") pod \"f1dc470d-c357-4a37-96ef-bfaf42513d47\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.628769 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-custom-prometheus-ca\") pod \"f1dc470d-c357-4a37-96ef-bfaf42513d47\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.628811 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1dc470d-c357-4a37-96ef-bfaf42513d47-logs\") pod 
\"f1dc470d-c357-4a37-96ef-bfaf42513d47\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.628914 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-config-data\") pod \"f1dc470d-c357-4a37-96ef-bfaf42513d47\" (UID: \"f1dc470d-c357-4a37-96ef-bfaf42513d47\") " Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.629242 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1dc470d-c357-4a37-96ef-bfaf42513d47-logs" (OuterVolumeSpecName: "logs") pod "f1dc470d-c357-4a37-96ef-bfaf42513d47" (UID: "f1dc470d-c357-4a37-96ef-bfaf42513d47"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.629581 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1dc470d-c357-4a37-96ef-bfaf42513d47-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.633692 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1dc470d-c357-4a37-96ef-bfaf42513d47-kube-api-access-8p4g7" (OuterVolumeSpecName: "kube-api-access-8p4g7") pod "f1dc470d-c357-4a37-96ef-bfaf42513d47" (UID: "f1dc470d-c357-4a37-96ef-bfaf42513d47"). InnerVolumeSpecName "kube-api-access-8p4g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.655296 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1dc470d-c357-4a37-96ef-bfaf42513d47" (UID: "f1dc470d-c357-4a37-96ef-bfaf42513d47"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.660884 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "f1dc470d-c357-4a37-96ef-bfaf42513d47" (UID: "f1dc470d-c357-4a37-96ef-bfaf42513d47"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.698805 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-config-data" (OuterVolumeSpecName: "config-data") pod "f1dc470d-c357-4a37-96ef-bfaf42513d47" (UID: "f1dc470d-c357-4a37-96ef-bfaf42513d47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.730646 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.730746 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.730759 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1dc470d-c357-4a37-96ef-bfaf42513d47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.730768 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p4g7\" (UniqueName: \"kubernetes.io/projected/f1dc470d-c357-4a37-96ef-bfaf42513d47-kube-api-access-8p4g7\") on node 
\"crc\" DevicePath \"\"" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.776069 4956 generic.go:334] "Generic (PLEG): container finished" podID="f1dc470d-c357-4a37-96ef-bfaf42513d47" containerID="994f37131d5fd390f2de0d17154667f7ff3236921f6d010fff083fbaec0a0f52" exitCode=0 Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.776107 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.776138 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f1dc470d-c357-4a37-96ef-bfaf42513d47","Type":"ContainerDied","Data":"994f37131d5fd390f2de0d17154667f7ff3236921f6d010fff083fbaec0a0f52"} Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.776171 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f1dc470d-c357-4a37-96ef-bfaf42513d47","Type":"ContainerDied","Data":"8c23dd28b717ca06c46ae4db94f14f56325349f598bf14f371652fac9bb95622"} Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.776205 4956 scope.go:117] "RemoveContainer" containerID="994f37131d5fd390f2de0d17154667f7ff3236921f6d010fff083fbaec0a0f52" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.779911 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942","Type":"ContainerStarted","Data":"ac5b27f33bb8aa47ac86d90c13c94fe606674fb0740fe3fa27b0f16065c99865"} Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.802324 4956 scope.go:117] "RemoveContainer" containerID="994f37131d5fd390f2de0d17154667f7ff3236921f6d010fff083fbaec0a0f52" Mar 14 09:36:21 crc kubenswrapper[4956]: E0314 09:36:21.803338 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"994f37131d5fd390f2de0d17154667f7ff3236921f6d010fff083fbaec0a0f52\": container with ID starting with 994f37131d5fd390f2de0d17154667f7ff3236921f6d010fff083fbaec0a0f52 not found: ID does not exist" containerID="994f37131d5fd390f2de0d17154667f7ff3236921f6d010fff083fbaec0a0f52" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.803379 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"994f37131d5fd390f2de0d17154667f7ff3236921f6d010fff083fbaec0a0f52"} err="failed to get container status \"994f37131d5fd390f2de0d17154667f7ff3236921f6d010fff083fbaec0a0f52\": rpc error: code = NotFound desc = could not find container \"994f37131d5fd390f2de0d17154667f7ff3236921f6d010fff083fbaec0a0f52\": container with ID starting with 994f37131d5fd390f2de0d17154667f7ff3236921f6d010fff083fbaec0a0f52 not found: ID does not exist" Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.823700 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:36:21 crc kubenswrapper[4956]: I0314 09:36:21.829288 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:36:22 crc kubenswrapper[4956]: I0314 09:36:22.791455 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942","Type":"ContainerStarted","Data":"4e47b0f89241b50715b810c0ef692b75b66274c21ea9afe7da0b4b8deda788ed"} Mar 14 09:36:22 crc kubenswrapper[4956]: I0314 09:36:22.791949 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942","Type":"ContainerStarted","Data":"ed14ab9facbdc478241ee958f5652840916454b099599ab3a7a51253587dde8f"} Mar 14 09:36:23 crc kubenswrapper[4956]: I0314 09:36:23.220128 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f1dc470d-c357-4a37-96ef-bfaf42513d47" path="/var/lib/kubelet/pods/f1dc470d-c357-4a37-96ef-bfaf42513d47/volumes" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.199339 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-zzrgx"] Mar 14 09:36:25 crc kubenswrapper[4956]: E0314 09:36:25.201735 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4825c86-0535-4bfb-b0fd-cd5a69771dcf" containerName="watcher-applier" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.202134 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4825c86-0535-4bfb-b0fd-cd5a69771dcf" containerName="watcher-applier" Mar 14 09:36:25 crc kubenswrapper[4956]: E0314 09:36:25.202164 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1dc470d-c357-4a37-96ef-bfaf42513d47" containerName="watcher-decision-engine" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.202172 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1dc470d-c357-4a37-96ef-bfaf42513d47" containerName="watcher-decision-engine" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.204017 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1dc470d-c357-4a37-96ef-bfaf42513d47" containerName="watcher-decision-engine" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.204051 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4825c86-0535-4bfb-b0fd-cd5a69771dcf" containerName="watcher-applier" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.207774 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-zzrgx" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.252117 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-zzrgx"] Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.296290 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7t72\" (UniqueName: \"kubernetes.io/projected/afa07278-73b8-4673-98ed-0e617c7426f9-kube-api-access-p7t72\") pod \"watcher-db-create-zzrgx\" (UID: \"afa07278-73b8-4673-98ed-0e617c7426f9\") " pod="watcher-kuttl-default/watcher-db-create-zzrgx" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.296624 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afa07278-73b8-4673-98ed-0e617c7426f9-operator-scripts\") pod \"watcher-db-create-zzrgx\" (UID: \"afa07278-73b8-4673-98ed-0e617c7426f9\") " pod="watcher-kuttl-default/watcher-db-create-zzrgx" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.324550 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs"] Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.325621 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.330367 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.338093 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs"] Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.398935 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52jx6\" (UniqueName: \"kubernetes.io/projected/21e7f711-f5e2-4517-bdb5-6c9a3eca6c82-kube-api-access-52jx6\") pod \"watcher-ed8a-account-create-update-rxbjs\" (UID: \"21e7f711-f5e2-4517-bdb5-6c9a3eca6c82\") " pod="watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.399015 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afa07278-73b8-4673-98ed-0e617c7426f9-operator-scripts\") pod \"watcher-db-create-zzrgx\" (UID: \"afa07278-73b8-4673-98ed-0e617c7426f9\") " pod="watcher-kuttl-default/watcher-db-create-zzrgx" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.399040 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21e7f711-f5e2-4517-bdb5-6c9a3eca6c82-operator-scripts\") pod \"watcher-ed8a-account-create-update-rxbjs\" (UID: \"21e7f711-f5e2-4517-bdb5-6c9a3eca6c82\") " pod="watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.399154 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7t72\" (UniqueName: 
\"kubernetes.io/projected/afa07278-73b8-4673-98ed-0e617c7426f9-kube-api-access-p7t72\") pod \"watcher-db-create-zzrgx\" (UID: \"afa07278-73b8-4673-98ed-0e617c7426f9\") " pod="watcher-kuttl-default/watcher-db-create-zzrgx" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.399793 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afa07278-73b8-4673-98ed-0e617c7426f9-operator-scripts\") pod \"watcher-db-create-zzrgx\" (UID: \"afa07278-73b8-4673-98ed-0e617c7426f9\") " pod="watcher-kuttl-default/watcher-db-create-zzrgx" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.420284 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7t72\" (UniqueName: \"kubernetes.io/projected/afa07278-73b8-4673-98ed-0e617c7426f9-kube-api-access-p7t72\") pod \"watcher-db-create-zzrgx\" (UID: \"afa07278-73b8-4673-98ed-0e617c7426f9\") " pod="watcher-kuttl-default/watcher-db-create-zzrgx" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.501523 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52jx6\" (UniqueName: \"kubernetes.io/projected/21e7f711-f5e2-4517-bdb5-6c9a3eca6c82-kube-api-access-52jx6\") pod \"watcher-ed8a-account-create-update-rxbjs\" (UID: \"21e7f711-f5e2-4517-bdb5-6c9a3eca6c82\") " pod="watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.501901 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21e7f711-f5e2-4517-bdb5-6c9a3eca6c82-operator-scripts\") pod \"watcher-ed8a-account-create-update-rxbjs\" (UID: \"21e7f711-f5e2-4517-bdb5-6c9a3eca6c82\") " pod="watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.502661 4956 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21e7f711-f5e2-4517-bdb5-6c9a3eca6c82-operator-scripts\") pod \"watcher-ed8a-account-create-update-rxbjs\" (UID: \"21e7f711-f5e2-4517-bdb5-6c9a3eca6c82\") " pod="watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.525699 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52jx6\" (UniqueName: \"kubernetes.io/projected/21e7f711-f5e2-4517-bdb5-6c9a3eca6c82-kube-api-access-52jx6\") pod \"watcher-ed8a-account-create-update-rxbjs\" (UID: \"21e7f711-f5e2-4517-bdb5-6c9a3eca6c82\") " pod="watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.588084 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-zzrgx" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.647009 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.832014 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942","Type":"ContainerStarted","Data":"30a9e48555c291ab01a4f9ba94eb4f63d58eb74badcc73b272d6c6fda9fa5c8e"} Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.835135 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:25 crc kubenswrapper[4956]: I0314 09:36:25.874605 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.368581888 podStartE2EDuration="6.874584543s" podCreationTimestamp="2026-03-14 09:36:19 +0000 UTC" firstStartedPulling="2026-03-14 09:36:20.554662736 +0000 UTC m=+2386.067355004" lastFinishedPulling="2026-03-14 09:36:25.060665391 +0000 UTC m=+2390.573357659" observedRunningTime="2026-03-14 09:36:25.867894003 +0000 UTC m=+2391.380586271" watchObservedRunningTime="2026-03-14 09:36:25.874584543 +0000 UTC m=+2391.387276801" Mar 14 09:36:26 crc kubenswrapper[4956]: I0314 09:36:26.053623 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-zzrgx"] Mar 14 09:36:26 crc kubenswrapper[4956]: W0314 09:36:26.063132 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafa07278_73b8_4673_98ed_0e617c7426f9.slice/crio-281b618eeec7e83dee33b68f7ed63ed19cc4549e6d22a6bded8fbecc825b2eb1 WatchSource:0}: Error finding container 281b618eeec7e83dee33b68f7ed63ed19cc4549e6d22a6bded8fbecc825b2eb1: Status 404 returned error can't find the container with id 281b618eeec7e83dee33b68f7ed63ed19cc4549e6d22a6bded8fbecc825b2eb1 Mar 14 09:36:26 crc kubenswrapper[4956]: W0314 09:36:26.204768 4956 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21e7f711_f5e2_4517_bdb5_6c9a3eca6c82.slice/crio-0081f62d299001b104109d396972a4961282012c340716491ecf302423811333 WatchSource:0}: Error finding container 0081f62d299001b104109d396972a4961282012c340716491ecf302423811333: Status 404 returned error can't find the container with id 0081f62d299001b104109d396972a4961282012c340716491ecf302423811333 Mar 14 09:36:26 crc kubenswrapper[4956]: I0314 09:36:26.207779 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs"] Mar 14 09:36:26 crc kubenswrapper[4956]: I0314 09:36:26.841582 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-zzrgx" event={"ID":"afa07278-73b8-4673-98ed-0e617c7426f9","Type":"ContainerStarted","Data":"6dc77a797fb23dd64c5e353052ea68ad0723873ef34ecff129286644d01910db"} Mar 14 09:36:26 crc kubenswrapper[4956]: I0314 09:36:26.841634 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-zzrgx" event={"ID":"afa07278-73b8-4673-98ed-0e617c7426f9","Type":"ContainerStarted","Data":"281b618eeec7e83dee33b68f7ed63ed19cc4549e6d22a6bded8fbecc825b2eb1"} Mar 14 09:36:26 crc kubenswrapper[4956]: I0314 09:36:26.843652 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs" event={"ID":"21e7f711-f5e2-4517-bdb5-6c9a3eca6c82","Type":"ContainerStarted","Data":"defe8aa91f6ee237c406e15df839c5268b357638d3922dc07b09178bffd879ae"} Mar 14 09:36:26 crc kubenswrapper[4956]: I0314 09:36:26.843697 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs" event={"ID":"21e7f711-f5e2-4517-bdb5-6c9a3eca6c82","Type":"ContainerStarted","Data":"0081f62d299001b104109d396972a4961282012c340716491ecf302423811333"} Mar 14 09:36:26 crc 
kubenswrapper[4956]: I0314 09:36:26.867073 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-db-create-zzrgx" podStartSLOduration=1.867055538 podStartE2EDuration="1.867055538s" podCreationTimestamp="2026-03-14 09:36:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:36:26.865431627 +0000 UTC m=+2392.378123895" watchObservedRunningTime="2026-03-14 09:36:26.867055538 +0000 UTC m=+2392.379747806" Mar 14 09:36:26 crc kubenswrapper[4956]: I0314 09:36:26.893113 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs" podStartSLOduration=1.893096521 podStartE2EDuration="1.893096521s" podCreationTimestamp="2026-03-14 09:36:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:36:26.890091594 +0000 UTC m=+2392.402783862" watchObservedRunningTime="2026-03-14 09:36:26.893096521 +0000 UTC m=+2392.405788789" Mar 14 09:36:27 crc kubenswrapper[4956]: I0314 09:36:27.209585 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:36:27 crc kubenswrapper[4956]: E0314 09:36:27.209832 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:36:27 crc kubenswrapper[4956]: I0314 09:36:27.860565 4956 generic.go:334] "Generic (PLEG): container finished" podID="21e7f711-f5e2-4517-bdb5-6c9a3eca6c82" 
containerID="defe8aa91f6ee237c406e15df839c5268b357638d3922dc07b09178bffd879ae" exitCode=0 Mar 14 09:36:27 crc kubenswrapper[4956]: I0314 09:36:27.860658 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs" event={"ID":"21e7f711-f5e2-4517-bdb5-6c9a3eca6c82","Type":"ContainerDied","Data":"defe8aa91f6ee237c406e15df839c5268b357638d3922dc07b09178bffd879ae"} Mar 14 09:36:27 crc kubenswrapper[4956]: I0314 09:36:27.864574 4956 generic.go:334] "Generic (PLEG): container finished" podID="afa07278-73b8-4673-98ed-0e617c7426f9" containerID="6dc77a797fb23dd64c5e353052ea68ad0723873ef34ecff129286644d01910db" exitCode=0 Mar 14 09:36:27 crc kubenswrapper[4956]: I0314 09:36:27.864681 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-zzrgx" event={"ID":"afa07278-73b8-4673-98ed-0e617c7426f9","Type":"ContainerDied","Data":"6dc77a797fb23dd64c5e353052ea68ad0723873ef34ecff129286644d01910db"} Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.281983 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-zzrgx" Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.288355 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs" Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.471386 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afa07278-73b8-4673-98ed-0e617c7426f9-operator-scripts\") pod \"afa07278-73b8-4673-98ed-0e617c7426f9\" (UID: \"afa07278-73b8-4673-98ed-0e617c7426f9\") " Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.471453 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7t72\" (UniqueName: \"kubernetes.io/projected/afa07278-73b8-4673-98ed-0e617c7426f9-kube-api-access-p7t72\") pod \"afa07278-73b8-4673-98ed-0e617c7426f9\" (UID: \"afa07278-73b8-4673-98ed-0e617c7426f9\") " Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.471580 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52jx6\" (UniqueName: \"kubernetes.io/projected/21e7f711-f5e2-4517-bdb5-6c9a3eca6c82-kube-api-access-52jx6\") pod \"21e7f711-f5e2-4517-bdb5-6c9a3eca6c82\" (UID: \"21e7f711-f5e2-4517-bdb5-6c9a3eca6c82\") " Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.471970 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afa07278-73b8-4673-98ed-0e617c7426f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "afa07278-73b8-4673-98ed-0e617c7426f9" (UID: "afa07278-73b8-4673-98ed-0e617c7426f9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.472450 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21e7f711-f5e2-4517-bdb5-6c9a3eca6c82-operator-scripts\") pod \"21e7f711-f5e2-4517-bdb5-6c9a3eca6c82\" (UID: \"21e7f711-f5e2-4517-bdb5-6c9a3eca6c82\") " Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.472744 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21e7f711-f5e2-4517-bdb5-6c9a3eca6c82-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "21e7f711-f5e2-4517-bdb5-6c9a3eca6c82" (UID: "21e7f711-f5e2-4517-bdb5-6c9a3eca6c82"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.472830 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21e7f711-f5e2-4517-bdb5-6c9a3eca6c82-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.472849 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afa07278-73b8-4673-98ed-0e617c7426f9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.482639 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e7f711-f5e2-4517-bdb5-6c9a3eca6c82-kube-api-access-52jx6" (OuterVolumeSpecName: "kube-api-access-52jx6") pod "21e7f711-f5e2-4517-bdb5-6c9a3eca6c82" (UID: "21e7f711-f5e2-4517-bdb5-6c9a3eca6c82"). InnerVolumeSpecName "kube-api-access-52jx6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.484569 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afa07278-73b8-4673-98ed-0e617c7426f9-kube-api-access-p7t72" (OuterVolumeSpecName: "kube-api-access-p7t72") pod "afa07278-73b8-4673-98ed-0e617c7426f9" (UID: "afa07278-73b8-4673-98ed-0e617c7426f9"). InnerVolumeSpecName "kube-api-access-p7t72". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.573866 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7t72\" (UniqueName: \"kubernetes.io/projected/afa07278-73b8-4673-98ed-0e617c7426f9-kube-api-access-p7t72\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.573901 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52jx6\" (UniqueName: \"kubernetes.io/projected/21e7f711-f5e2-4517-bdb5-6c9a3eca6c82-kube-api-access-52jx6\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.881647 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs" event={"ID":"21e7f711-f5e2-4517-bdb5-6c9a3eca6c82","Type":"ContainerDied","Data":"0081f62d299001b104109d396972a4961282012c340716491ecf302423811333"} Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.881714 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0081f62d299001b104109d396972a4961282012c340716491ecf302423811333" Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.881677 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs" Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.882876 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-zzrgx" event={"ID":"afa07278-73b8-4673-98ed-0e617c7426f9","Type":"ContainerDied","Data":"281b618eeec7e83dee33b68f7ed63ed19cc4549e6d22a6bded8fbecc825b2eb1"} Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.882905 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="281b618eeec7e83dee33b68f7ed63ed19cc4549e6d22a6bded8fbecc825b2eb1" Mar 14 09:36:29 crc kubenswrapper[4956]: I0314 09:36:29.882958 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-zzrgx" Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.609921 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd"] Mar 14 09:36:30 crc kubenswrapper[4956]: E0314 09:36:30.610559 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa07278-73b8-4673-98ed-0e617c7426f9" containerName="mariadb-database-create" Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.610574 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa07278-73b8-4673-98ed-0e617c7426f9" containerName="mariadb-database-create" Mar 14 09:36:30 crc kubenswrapper[4956]: E0314 09:36:30.610606 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e7f711-f5e2-4517-bdb5-6c9a3eca6c82" containerName="mariadb-account-create-update" Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.610615 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e7f711-f5e2-4517-bdb5-6c9a3eca6c82" containerName="mariadb-account-create-update" Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.610795 4956 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="21e7f711-f5e2-4517-bdb5-6c9a3eca6c82" containerName="mariadb-account-create-update" Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.610821 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="afa07278-73b8-4673-98ed-0e617c7426f9" containerName="mariadb-database-create" Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.611469 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd" Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.613243 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-tlsdz" Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.613326 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.621066 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd"] Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.792255 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-bjgfd\" (UID: \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd" Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.792306 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-db-sync-config-data\") pod \"watcher-kuttl-db-sync-bjgfd\" (UID: \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd" Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.792356 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-config-data\") pod \"watcher-kuttl-db-sync-bjgfd\" (UID: \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd" Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.792399 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5krfc\" (UniqueName: \"kubernetes.io/projected/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-kube-api-access-5krfc\") pod \"watcher-kuttl-db-sync-bjgfd\" (UID: \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd" Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.893613 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-config-data\") pod \"watcher-kuttl-db-sync-bjgfd\" (UID: \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd" Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.893708 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5krfc\" (UniqueName: \"kubernetes.io/projected/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-kube-api-access-5krfc\") pod \"watcher-kuttl-db-sync-bjgfd\" (UID: \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd" Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.893874 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-bjgfd\" (UID: \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd" Mar 14 09:36:30 crc kubenswrapper[4956]: 
I0314 09:36:30.893939 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-db-sync-config-data\") pod \"watcher-kuttl-db-sync-bjgfd\" (UID: \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd"
Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.898556 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-bjgfd\" (UID: \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd"
Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.899907 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-config-data\") pod \"watcher-kuttl-db-sync-bjgfd\" (UID: \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd"
Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.901336 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-db-sync-config-data\") pod \"watcher-kuttl-db-sync-bjgfd\" (UID: \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd"
Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.916219 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5krfc\" (UniqueName: \"kubernetes.io/projected/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-kube-api-access-5krfc\") pod \"watcher-kuttl-db-sync-bjgfd\" (UID: \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd"
Mar 14 09:36:30 crc kubenswrapper[4956]: I0314 09:36:30.930106 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd"
Mar 14 09:36:31 crc kubenswrapper[4956]: I0314 09:36:31.399459 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd"]
Mar 14 09:36:31 crc kubenswrapper[4956]: W0314 09:36:31.423556 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf53baaaa_3c9e_4e73_a8ca_8ab58635e119.slice/crio-0fe84b4b581494486935a881164f9d8333203731e8846aaab38de81bed8f895a WatchSource:0}: Error finding container 0fe84b4b581494486935a881164f9d8333203731e8846aaab38de81bed8f895a: Status 404 returned error can't find the container with id 0fe84b4b581494486935a881164f9d8333203731e8846aaab38de81bed8f895a
Mar 14 09:36:31 crc kubenswrapper[4956]: I0314 09:36:31.901528 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd" event={"ID":"f53baaaa-3c9e-4e73-a8ca-8ab58635e119","Type":"ContainerStarted","Data":"97ba76b7f590e48964927a1c93ac0cc7d2147eeca7cd5c6a69dc38cdffe3b900"}
Mar 14 09:36:31 crc kubenswrapper[4956]: I0314 09:36:31.901846 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd" event={"ID":"f53baaaa-3c9e-4e73-a8ca-8ab58635e119","Type":"ContainerStarted","Data":"0fe84b4b581494486935a881164f9d8333203731e8846aaab38de81bed8f895a"}
Mar 14 09:36:31 crc kubenswrapper[4956]: I0314 09:36:31.919650 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd" podStartSLOduration=1.919628152 podStartE2EDuration="1.919628152s" podCreationTimestamp="2026-03-14 09:36:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:36:31.916782459 +0000 UTC m=+2397.429474728" watchObservedRunningTime="2026-03-14 09:36:31.919628152 +0000 UTC m=+2397.432320430"
Mar 14 09:36:33 crc kubenswrapper[4956]: I0314 09:36:33.916320 4956 generic.go:334] "Generic (PLEG): container finished" podID="f53baaaa-3c9e-4e73-a8ca-8ab58635e119" containerID="97ba76b7f590e48964927a1c93ac0cc7d2147eeca7cd5c6a69dc38cdffe3b900" exitCode=0
Mar 14 09:36:33 crc kubenswrapper[4956]: I0314 09:36:33.916416 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd" event={"ID":"f53baaaa-3c9e-4e73-a8ca-8ab58635e119","Type":"ContainerDied","Data":"97ba76b7f590e48964927a1c93ac0cc7d2147eeca7cd5c6a69dc38cdffe3b900"}
Mar 14 09:36:35 crc kubenswrapper[4956]: I0314 09:36:35.221781 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd"
Mar 14 09:36:35 crc kubenswrapper[4956]: I0314 09:36:35.362816 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5krfc\" (UniqueName: \"kubernetes.io/projected/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-kube-api-access-5krfc\") pod \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\" (UID: \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\") "
Mar 14 09:36:35 crc kubenswrapper[4956]: I0314 09:36:35.362950 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-db-sync-config-data\") pod \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\" (UID: \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\") "
Mar 14 09:36:35 crc kubenswrapper[4956]: I0314 09:36:35.362974 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-combined-ca-bundle\") pod \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\" (UID: \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\") "
Mar 14 09:36:35 crc kubenswrapper[4956]: I0314 09:36:35.363069 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-config-data\") pod \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\" (UID: \"f53baaaa-3c9e-4e73-a8ca-8ab58635e119\") "
Mar 14 09:36:35 crc kubenswrapper[4956]: I0314 09:36:35.368305 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f53baaaa-3c9e-4e73-a8ca-8ab58635e119" (UID: "f53baaaa-3c9e-4e73-a8ca-8ab58635e119"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:36:35 crc kubenswrapper[4956]: I0314 09:36:35.370315 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-kube-api-access-5krfc" (OuterVolumeSpecName: "kube-api-access-5krfc") pod "f53baaaa-3c9e-4e73-a8ca-8ab58635e119" (UID: "f53baaaa-3c9e-4e73-a8ca-8ab58635e119"). InnerVolumeSpecName "kube-api-access-5krfc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:36:35 crc kubenswrapper[4956]: I0314 09:36:35.386022 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f53baaaa-3c9e-4e73-a8ca-8ab58635e119" (UID: "f53baaaa-3c9e-4e73-a8ca-8ab58635e119"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:36:35 crc kubenswrapper[4956]: I0314 09:36:35.403703 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-config-data" (OuterVolumeSpecName: "config-data") pod "f53baaaa-3c9e-4e73-a8ca-8ab58635e119" (UID: "f53baaaa-3c9e-4e73-a8ca-8ab58635e119"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:36:35 crc kubenswrapper[4956]: I0314 09:36:35.465480 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 09:36:35 crc kubenswrapper[4956]: I0314 09:36:35.465538 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5krfc\" (UniqueName: \"kubernetes.io/projected/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-kube-api-access-5krfc\") on node \"crc\" DevicePath \"\""
Mar 14 09:36:35 crc kubenswrapper[4956]: I0314 09:36:35.465554 4956 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 09:36:35 crc kubenswrapper[4956]: I0314 09:36:35.465566 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53baaaa-3c9e-4e73-a8ca-8ab58635e119-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 09:36:35 crc kubenswrapper[4956]: I0314 09:36:35.937164 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd" event={"ID":"f53baaaa-3c9e-4e73-a8ca-8ab58635e119","Type":"ContainerDied","Data":"0fe84b4b581494486935a881164f9d8333203731e8846aaab38de81bed8f895a"}
Mar 14 09:36:35 crc kubenswrapper[4956]: I0314 09:36:35.937202 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fe84b4b581494486935a881164f9d8333203731e8846aaab38de81bed8f895a"
Mar 14 09:36:35 crc kubenswrapper[4956]: I0314 09:36:35.937240 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.266488 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Mar 14 09:36:36 crc kubenswrapper[4956]: E0314 09:36:36.267366 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53baaaa-3c9e-4e73-a8ca-8ab58635e119" containerName="watcher-kuttl-db-sync"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.267384 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53baaaa-3c9e-4e73-a8ca-8ab58635e119" containerName="watcher-kuttl-db-sync"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.267627 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53baaaa-3c9e-4e73-a8ca-8ab58635e119" containerName="watcher-kuttl-db-sync"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.268745 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.275335 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-tlsdz"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.275694 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.275746 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.284970 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.285176 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.285257 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.285291 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.285412 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.285453 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5xvd\" (UniqueName: \"kubernetes.io/projected/606d56d8-87ca-48af-8adb-b22184238e89-kube-api-access-d5xvd\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.285519 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/606d56d8-87ca-48af-8adb-b22184238e89-logs\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.285555 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.289850 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.330670 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.332025 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.333885 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.351741 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.388195 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.388734 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcpwk\" (UniqueName: \"kubernetes.io/projected/c68d3098-9fdb-434d-a7d7-796a245cba1c-kube-api-access-rcpwk\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.388828 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.388912 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c68d3098-9fdb-434d-a7d7-796a245cba1c-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.388997 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.389062 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.389133 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.389205 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.389270 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5xvd\" (UniqueName: \"kubernetes.io/projected/606d56d8-87ca-48af-8adb-b22184238e89-kube-api-access-d5xvd\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.389337 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/606d56d8-87ca-48af-8adb-b22184238e89-logs\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.389416 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.389684 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.390901 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/606d56d8-87ca-48af-8adb-b22184238e89-logs\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.391783 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.393412 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.396477 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.397982 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.399331 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.404885 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.405987 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.407421 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.409061 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.423117 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5xvd\" (UniqueName: \"kubernetes.io/projected/606d56d8-87ca-48af-8adb-b22184238e89-kube-api-access-d5xvd\") pod \"watcher-kuttl-api-0\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.490504 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/973f66e6-bf42-4786-9330-65c6ca115a9c-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"973f66e6-bf42-4786-9330-65c6ca115a9c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.490568 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.490604 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.490634 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973f66e6-bf42-4786-9330-65c6ca115a9c-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"973f66e6-bf42-4786-9330-65c6ca115a9c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.490674 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcpwk\" (UniqueName: \"kubernetes.io/projected/c68d3098-9fdb-434d-a7d7-796a245cba1c-kube-api-access-rcpwk\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.490751 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973f66e6-bf42-4786-9330-65c6ca115a9c-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"973f66e6-bf42-4786-9330-65c6ca115a9c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.490791 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c68d3098-9fdb-434d-a7d7-796a245cba1c-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.490828 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c29n\" (UniqueName: \"kubernetes.io/projected/973f66e6-bf42-4786-9330-65c6ca115a9c-kube-api-access-6c29n\") pod \"watcher-kuttl-applier-0\" (UID: \"973f66e6-bf42-4786-9330-65c6ca115a9c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.490857 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.491915 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c68d3098-9fdb-434d-a7d7-796a245cba1c-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.493568 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.494458 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.494631 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.514412 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcpwk\" (UniqueName: \"kubernetes.io/projected/c68d3098-9fdb-434d-a7d7-796a245cba1c-kube-api-access-rcpwk\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.583197 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.592097 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973f66e6-bf42-4786-9330-65c6ca115a9c-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"973f66e6-bf42-4786-9330-65c6ca115a9c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.592718 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973f66e6-bf42-4786-9330-65c6ca115a9c-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"973f66e6-bf42-4786-9330-65c6ca115a9c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.592769 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c29n\" (UniqueName: \"kubernetes.io/projected/973f66e6-bf42-4786-9330-65c6ca115a9c-kube-api-access-6c29n\") pod \"watcher-kuttl-applier-0\" (UID: \"973f66e6-bf42-4786-9330-65c6ca115a9c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.592833 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/973f66e6-bf42-4786-9330-65c6ca115a9c-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"973f66e6-bf42-4786-9330-65c6ca115a9c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.593175 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/973f66e6-bf42-4786-9330-65c6ca115a9c-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"973f66e6-bf42-4786-9330-65c6ca115a9c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.595627 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973f66e6-bf42-4786-9330-65c6ca115a9c-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"973f66e6-bf42-4786-9330-65c6ca115a9c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.596250 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973f66e6-bf42-4786-9330-65c6ca115a9c-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"973f66e6-bf42-4786-9330-65c6ca115a9c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.615247 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c29n\" (UniqueName: \"kubernetes.io/projected/973f66e6-bf42-4786-9330-65c6ca115a9c-kube-api-access-6c29n\") pod \"watcher-kuttl-applier-0\" (UID: \"973f66e6-bf42-4786-9330-65c6ca115a9c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.645550 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:36 crc kubenswrapper[4956]: I0314 09:36:36.794044 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Mar 14 09:36:37 crc kubenswrapper[4956]: I0314 09:36:37.015275 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Mar 14 09:36:37 crc kubenswrapper[4956]: I0314 09:36:37.072615 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Mar 14 09:36:37 crc kubenswrapper[4956]: W0314 09:36:37.075845 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc68d3098_9fdb_434d_a7d7_796a245cba1c.slice/crio-b3e08feea66ccc347f80e9f9f3a0f1e0e947e412e5727c2d6f9666536d6fd559 WatchSource:0}: Error finding container b3e08feea66ccc347f80e9f9f3a0f1e0e947e412e5727c2d6f9666536d6fd559: Status 404 returned error can't find the container with id b3e08feea66ccc347f80e9f9f3a0f1e0e947e412e5727c2d6f9666536d6fd559
Mar 14 09:36:37 crc kubenswrapper[4956]: W0314 09:36:37.225012 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod973f66e6_bf42_4786_9330_65c6ca115a9c.slice/crio-3971f5cdbedccafc4767af72e3501c4d87e5f6eb33dbdd09e1cb2df2af891f3b WatchSource:0}: Error finding container 3971f5cdbedccafc4767af72e3501c4d87e5f6eb33dbdd09e1cb2df2af891f3b: Status 404 returned error can't find the container with id 3971f5cdbedccafc4767af72e3501c4d87e5f6eb33dbdd09e1cb2df2af891f3b
Mar 14 09:36:37 crc kubenswrapper[4956]: I0314 09:36:37.227616 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Mar 14 09:36:37 crc kubenswrapper[4956]: I0314 09:36:37.957713 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"c68d3098-9fdb-434d-a7d7-796a245cba1c","Type":"ContainerStarted","Data":"f06a5f6604d07e00fca82447a564124b08b35fc856be7e3d214c24bbef2eec4f"}
Mar 14 09:36:37 crc kubenswrapper[4956]: I0314 09:36:37.958053 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"c68d3098-9fdb-434d-a7d7-796a245cba1c","Type":"ContainerStarted","Data":"b3e08feea66ccc347f80e9f9f3a0f1e0e947e412e5727c2d6f9666536d6fd559"}
Mar 14 09:36:37 crc kubenswrapper[4956]: I0314 09:36:37.959705 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"606d56d8-87ca-48af-8adb-b22184238e89","Type":"ContainerStarted","Data":"2dbf3dbdb49d00c545475e75adfd9f9a7406968b69b8be25c51f7278998410cb"}
Mar 14 09:36:37 crc kubenswrapper[4956]: I0314 09:36:37.959746 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"606d56d8-87ca-48af-8adb-b22184238e89","Type":"ContainerStarted","Data":"689efd49935da69a07ca5733d20eb39d8b136f30788b622f425181aa3724f9b8"}
Mar 14 09:36:37 crc kubenswrapper[4956]: I0314 09:36:37.959772 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"606d56d8-87ca-48af-8adb-b22184238e89","Type":"ContainerStarted","Data":"b8110c510a862184f8954ef04cfd9e8fa5cf124f78fe95d49b7ca298faf93821"}
Mar 14 09:36:37 crc kubenswrapper[4956]: I0314 09:36:37.960064 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:37 crc kubenswrapper[4956]: I0314 09:36:37.961117 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"973f66e6-bf42-4786-9330-65c6ca115a9c","Type":"ContainerStarted","Data":"71f24bdb3d02e8026ee0f519394df1c00564295e7cf5fc5120141adaff4dbe71"}
Mar 14 09:36:37 crc kubenswrapper[4956]: I0314 09:36:37.961155 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"973f66e6-bf42-4786-9330-65c6ca115a9c","Type":"ContainerStarted","Data":"3971f5cdbedccafc4767af72e3501c4d87e5f6eb33dbdd09e1cb2df2af891f3b"}
Mar 14 09:36:37 crc kubenswrapper[4956]: I0314 09:36:37.987975 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=1.987945083 podStartE2EDuration="1.987945083s" podCreationTimestamp="2026-03-14 09:36:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:36:37.984190747 +0000 UTC m=+2403.496883045" watchObservedRunningTime="2026-03-14 09:36:37.987945083 +0000 UTC m=+2403.500637351"
Mar 14 09:36:38 crc kubenswrapper[4956]: I0314 09:36:38.024345 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.024321978 podStartE2EDuration="2.024321978s" podCreationTimestamp="2026-03-14 09:36:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:36:38.011281817 +0000 UTC m=+2403.523974105" watchObservedRunningTime="2026-03-14 09:36:38.024321978 +0000 UTC m=+2403.537014246"
Mar 14 09:36:38 crc kubenswrapper[4956]: I0314 09:36:38.035721 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.035706058 podStartE2EDuration="2.035706058s" podCreationTimestamp="2026-03-14 09:36:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:36:38.032307202 +0000 UTC m=+2403.544999490" watchObservedRunningTime="2026-03-14 09:36:38.035706058 +0000 UTC m=+2403.548398326"
Mar 14 09:36:39 crc kubenswrapper[4956]: I0314 09:36:39.209820 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a"
Mar 14 09:36:39 crc kubenswrapper[4956]: E0314 09:36:39.210611 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94"
Mar 14 09:36:40 crc kubenswrapper[4956]: I0314 09:36:40.220635 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:41 crc kubenswrapper[4956]: I0314 09:36:41.584336 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:41 crc kubenswrapper[4956]: I0314 09:36:41.794727 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Mar 14 09:36:46 crc kubenswrapper[4956]: I0314 09:36:46.583662 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:46 crc kubenswrapper[4956]: I0314 09:36:46.591313 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Mar 14 09:36:46 crc kubenswrapper[4956]: I0314 09:36:46.646534 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Mar 14 09:36:46 crc kubenswrapper[4956]: I0314 09:36:46.676259 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started"
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:46 crc kubenswrapper[4956]: I0314 09:36:46.794661 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:46 crc kubenswrapper[4956]: I0314 09:36:46.825530 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:47 crc kubenswrapper[4956]: I0314 09:36:47.037904 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:47 crc kubenswrapper[4956]: I0314 09:36:47.048024 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:47 crc kubenswrapper[4956]: I0314 09:36:47.060153 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:47 crc kubenswrapper[4956]: I0314 09:36:47.064977 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:48 crc kubenswrapper[4956]: I0314 09:36:48.851049 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:36:48 crc kubenswrapper[4956]: I0314 09:36:48.851689 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="ceilometer-central-agent" containerID="cri-o://ac5b27f33bb8aa47ac86d90c13c94fe606674fb0740fe3fa27b0f16065c99865" gracePeriod=30 Mar 14 09:36:48 crc kubenswrapper[4956]: I0314 09:36:48.852458 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="sg-core" 
containerID="cri-o://4e47b0f89241b50715b810c0ef692b75b66274c21ea9afe7da0b4b8deda788ed" gracePeriod=30 Mar 14 09:36:48 crc kubenswrapper[4956]: I0314 09:36:48.852455 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="proxy-httpd" containerID="cri-o://30a9e48555c291ab01a4f9ba94eb4f63d58eb74badcc73b272d6c6fda9fa5c8e" gracePeriod=30 Mar 14 09:36:48 crc kubenswrapper[4956]: I0314 09:36:48.852455 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="ceilometer-notification-agent" containerID="cri-o://ed14ab9facbdc478241ee958f5652840916454b099599ab3a7a51253587dde8f" gracePeriod=30 Mar 14 09:36:48 crc kubenswrapper[4956]: I0314 09:36:48.867790 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.178:3000/\": EOF" Mar 14 09:36:49 crc kubenswrapper[4956]: I0314 09:36:49.067900 4956 generic.go:334] "Generic (PLEG): container finished" podID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerID="30a9e48555c291ab01a4f9ba94eb4f63d58eb74badcc73b272d6c6fda9fa5c8e" exitCode=0 Mar 14 09:36:49 crc kubenswrapper[4956]: I0314 09:36:49.068145 4956 generic.go:334] "Generic (PLEG): container finished" podID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerID="4e47b0f89241b50715b810c0ef692b75b66274c21ea9afe7da0b4b8deda788ed" exitCode=2 Mar 14 09:36:49 crc kubenswrapper[4956]: I0314 09:36:49.067959 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942","Type":"ContainerDied","Data":"30a9e48555c291ab01a4f9ba94eb4f63d58eb74badcc73b272d6c6fda9fa5c8e"} Mar 14 09:36:49 crc kubenswrapper[4956]: 
I0314 09:36:49.068263 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942","Type":"ContainerDied","Data":"4e47b0f89241b50715b810c0ef692b75b66274c21ea9afe7da0b4b8deda788ed"} Mar 14 09:36:49 crc kubenswrapper[4956]: I0314 09:36:49.670227 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:36:49 crc kubenswrapper[4956]: I0314 09:36:49.670456 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="606d56d8-87ca-48af-8adb-b22184238e89" containerName="watcher-kuttl-api-log" containerID="cri-o://689efd49935da69a07ca5733d20eb39d8b136f30788b622f425181aa3724f9b8" gracePeriod=30 Mar 14 09:36:49 crc kubenswrapper[4956]: I0314 09:36:49.670601 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="606d56d8-87ca-48af-8adb-b22184238e89" containerName="watcher-api" containerID="cri-o://2dbf3dbdb49d00c545475e75adfd9f9a7406968b69b8be25c51f7278998410cb" gracePeriod=30 Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.076757 4956 generic.go:334] "Generic (PLEG): container finished" podID="606d56d8-87ca-48af-8adb-b22184238e89" containerID="689efd49935da69a07ca5733d20eb39d8b136f30788b622f425181aa3724f9b8" exitCode=143 Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.076823 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"606d56d8-87ca-48af-8adb-b22184238e89","Type":"ContainerDied","Data":"689efd49935da69a07ca5733d20eb39d8b136f30788b622f425181aa3724f9b8"} Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.079909 4956 generic.go:334] "Generic (PLEG): container finished" podID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerID="ac5b27f33bb8aa47ac86d90c13c94fe606674fb0740fe3fa27b0f16065c99865" exitCode=0 
Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.079935 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942","Type":"ContainerDied","Data":"ac5b27f33bb8aa47ac86d90c13c94fe606674fb0740fe3fa27b0f16065c99865"} Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.165120 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.178:3000/\": dial tcp 10.217.0.178:3000: connect: connection refused" Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.615839 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.722100 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5xvd\" (UniqueName: \"kubernetes.io/projected/606d56d8-87ca-48af-8adb-b22184238e89-kube-api-access-d5xvd\") pod \"606d56d8-87ca-48af-8adb-b22184238e89\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.722192 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-public-tls-certs\") pod \"606d56d8-87ca-48af-8adb-b22184238e89\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.722219 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-combined-ca-bundle\") pod \"606d56d8-87ca-48af-8adb-b22184238e89\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 
09:36:50.722240 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/606d56d8-87ca-48af-8adb-b22184238e89-logs\") pod \"606d56d8-87ca-48af-8adb-b22184238e89\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.722307 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-custom-prometheus-ca\") pod \"606d56d8-87ca-48af-8adb-b22184238e89\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.722324 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-internal-tls-certs\") pod \"606d56d8-87ca-48af-8adb-b22184238e89\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.722439 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-config-data\") pod \"606d56d8-87ca-48af-8adb-b22184238e89\" (UID: \"606d56d8-87ca-48af-8adb-b22184238e89\") " Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.722646 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/606d56d8-87ca-48af-8adb-b22184238e89-logs" (OuterVolumeSpecName: "logs") pod "606d56d8-87ca-48af-8adb-b22184238e89" (UID: "606d56d8-87ca-48af-8adb-b22184238e89"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.723295 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/606d56d8-87ca-48af-8adb-b22184238e89-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.743878 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/606d56d8-87ca-48af-8adb-b22184238e89-kube-api-access-d5xvd" (OuterVolumeSpecName: "kube-api-access-d5xvd") pod "606d56d8-87ca-48af-8adb-b22184238e89" (UID: "606d56d8-87ca-48af-8adb-b22184238e89"). InnerVolumeSpecName "kube-api-access-d5xvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.752457 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "606d56d8-87ca-48af-8adb-b22184238e89" (UID: "606d56d8-87ca-48af-8adb-b22184238e89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.758781 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "606d56d8-87ca-48af-8adb-b22184238e89" (UID: "606d56d8-87ca-48af-8adb-b22184238e89"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.773304 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-config-data" (OuterVolumeSpecName: "config-data") pod "606d56d8-87ca-48af-8adb-b22184238e89" (UID: "606d56d8-87ca-48af-8adb-b22184238e89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.773473 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "606d56d8-87ca-48af-8adb-b22184238e89" (UID: "606d56d8-87ca-48af-8adb-b22184238e89"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.775777 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "606d56d8-87ca-48af-8adb-b22184238e89" (UID: "606d56d8-87ca-48af-8adb-b22184238e89"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.824542 4956 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.824575 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.824588 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.824596 4956 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.824606 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/606d56d8-87ca-48af-8adb-b22184238e89-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:50 crc kubenswrapper[4956]: I0314 09:36:50.824614 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5xvd\" (UniqueName: \"kubernetes.io/projected/606d56d8-87ca-48af-8adb-b22184238e89-kube-api-access-d5xvd\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.088758 4956 generic.go:334] "Generic (PLEG): container finished" podID="606d56d8-87ca-48af-8adb-b22184238e89" containerID="2dbf3dbdb49d00c545475e75adfd9f9a7406968b69b8be25c51f7278998410cb" exitCode=0 Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 
09:36:51.088822 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"606d56d8-87ca-48af-8adb-b22184238e89","Type":"ContainerDied","Data":"2dbf3dbdb49d00c545475e75adfd9f9a7406968b69b8be25c51f7278998410cb"} Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.088851 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"606d56d8-87ca-48af-8adb-b22184238e89","Type":"ContainerDied","Data":"b8110c510a862184f8954ef04cfd9e8fa5cf124f78fe95d49b7ca298faf93821"} Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.088857 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.088871 4956 scope.go:117] "RemoveContainer" containerID="2dbf3dbdb49d00c545475e75adfd9f9a7406968b69b8be25c51f7278998410cb" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.121178 4956 scope.go:117] "RemoveContainer" containerID="689efd49935da69a07ca5733d20eb39d8b136f30788b622f425181aa3724f9b8" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.126912 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.137591 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.142656 4956 scope.go:117] "RemoveContainer" containerID="2dbf3dbdb49d00c545475e75adfd9f9a7406968b69b8be25c51f7278998410cb" Mar 14 09:36:51 crc kubenswrapper[4956]: E0314 09:36:51.143208 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dbf3dbdb49d00c545475e75adfd9f9a7406968b69b8be25c51f7278998410cb\": container with ID starting with 
2dbf3dbdb49d00c545475e75adfd9f9a7406968b69b8be25c51f7278998410cb not found: ID does not exist" containerID="2dbf3dbdb49d00c545475e75adfd9f9a7406968b69b8be25c51f7278998410cb" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.143243 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dbf3dbdb49d00c545475e75adfd9f9a7406968b69b8be25c51f7278998410cb"} err="failed to get container status \"2dbf3dbdb49d00c545475e75adfd9f9a7406968b69b8be25c51f7278998410cb\": rpc error: code = NotFound desc = could not find container \"2dbf3dbdb49d00c545475e75adfd9f9a7406968b69b8be25c51f7278998410cb\": container with ID starting with 2dbf3dbdb49d00c545475e75adfd9f9a7406968b69b8be25c51f7278998410cb not found: ID does not exist" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.143270 4956 scope.go:117] "RemoveContainer" containerID="689efd49935da69a07ca5733d20eb39d8b136f30788b622f425181aa3724f9b8" Mar 14 09:36:51 crc kubenswrapper[4956]: E0314 09:36:51.143535 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"689efd49935da69a07ca5733d20eb39d8b136f30788b622f425181aa3724f9b8\": container with ID starting with 689efd49935da69a07ca5733d20eb39d8b136f30788b622f425181aa3724f9b8 not found: ID does not exist" containerID="689efd49935da69a07ca5733d20eb39d8b136f30788b622f425181aa3724f9b8" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.143646 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"689efd49935da69a07ca5733d20eb39d8b136f30788b622f425181aa3724f9b8"} err="failed to get container status \"689efd49935da69a07ca5733d20eb39d8b136f30788b622f425181aa3724f9b8\": rpc error: code = NotFound desc = could not find container \"689efd49935da69a07ca5733d20eb39d8b136f30788b622f425181aa3724f9b8\": container with ID starting with 689efd49935da69a07ca5733d20eb39d8b136f30788b622f425181aa3724f9b8 not found: ID does not 
exist" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.145813 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:36:51 crc kubenswrapper[4956]: E0314 09:36:51.146104 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="606d56d8-87ca-48af-8adb-b22184238e89" containerName="watcher-api" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.146120 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="606d56d8-87ca-48af-8adb-b22184238e89" containerName="watcher-api" Mar 14 09:36:51 crc kubenswrapper[4956]: E0314 09:36:51.146133 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="606d56d8-87ca-48af-8adb-b22184238e89" containerName="watcher-kuttl-api-log" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.146139 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="606d56d8-87ca-48af-8adb-b22184238e89" containerName="watcher-kuttl-api-log" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.146304 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="606d56d8-87ca-48af-8adb-b22184238e89" containerName="watcher-api" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.146320 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="606d56d8-87ca-48af-8adb-b22184238e89" containerName="watcher-kuttl-api-log" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.147080 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.149405 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.149519 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.153782 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.160427 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.220913 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="606d56d8-87ca-48af-8adb-b22184238e89" path="/var/lib/kubelet/pods/606d56d8-87ca-48af-8adb-b22184238e89/volumes" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.229779 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.229869 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.229924 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.229986 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.230062 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4774\" (UniqueName: \"kubernetes.io/projected/b2815683-6505-4893-a546-159ee65da05a-kube-api-access-g4774\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.230102 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.230136 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2815683-6505-4893-a546-159ee65da05a-logs\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.332033 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-g4774\" (UniqueName: \"kubernetes.io/projected/b2815683-6505-4893-a546-159ee65da05a-kube-api-access-g4774\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.332096 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.332123 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2815683-6505-4893-a546-159ee65da05a-logs\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.332163 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.332207 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.332248 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.332293 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.332987 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2815683-6505-4893-a546-159ee65da05a-logs\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.335950 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.336034 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.336269 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: 
\"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.336668 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.339128 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.349382 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4774\" (UniqueName: \"kubernetes.io/projected/b2815683-6505-4893-a546-159ee65da05a-kube-api-access-g4774\") pod \"watcher-kuttl-api-0\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.470278 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:51 crc kubenswrapper[4956]: I0314 09:36:51.865738 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:36:51 crc kubenswrapper[4956]: W0314 09:36:51.876145 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2815683_6505_4893_a546_159ee65da05a.slice/crio-1c1921919c255f32f0bcb67f3ea3bb37d93e9a7fc900a9c3c688f95fbeade28c WatchSource:0}: Error finding container 1c1921919c255f32f0bcb67f3ea3bb37d93e9a7fc900a9c3c688f95fbeade28c: Status 404 returned error can't find the container with id 1c1921919c255f32f0bcb67f3ea3bb37d93e9a7fc900a9c3c688f95fbeade28c Mar 14 09:36:52 crc kubenswrapper[4956]: I0314 09:36:52.099116 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b2815683-6505-4893-a546-159ee65da05a","Type":"ContainerStarted","Data":"4b85fb3cc0ff0dfb62e9a42b6ca3ce97cca8f5644bb4bab4da643223e692bec4"} Mar 14 09:36:52 crc kubenswrapper[4956]: I0314 09:36:52.099155 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b2815683-6505-4893-a546-159ee65da05a","Type":"ContainerStarted","Data":"1c1921919c255f32f0bcb67f3ea3bb37d93e9a7fc900a9c3c688f95fbeade28c"} Mar 14 09:36:52 crc kubenswrapper[4956]: I0314 09:36:52.925318 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd"] Mar 14 09:36:52 crc kubenswrapper[4956]: I0314 09:36:52.934860 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-bjgfd"] Mar 14 09:36:52 crc kubenswrapper[4956]: I0314 09:36:52.958270 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watchered8a-account-delete-4487d"] Mar 14 09:36:52 crc kubenswrapper[4956]: I0314 
09:36:52.959825 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchered8a-account-delete-4487d" Mar 14 09:36:52 crc kubenswrapper[4956]: I0314 09:36:52.998431 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:36:52 crc kubenswrapper[4956]: I0314 09:36:52.998701 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="973f66e6-bf42-4786-9330-65c6ca115a9c" containerName="watcher-applier" containerID="cri-o://71f24bdb3d02e8026ee0f519394df1c00564295e7cf5fc5120141adaff4dbe71" gracePeriod=30 Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.025024 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchered8a-account-delete-4487d"] Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.035596 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.059511 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ef73ee0-b136-4016-bcbd-28a1590b3305-operator-scripts\") pod \"watchered8a-account-delete-4487d\" (UID: \"6ef73ee0-b136-4016-bcbd-28a1590b3305\") " pod="watcher-kuttl-default/watchered8a-account-delete-4487d" Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.059590 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75xvq\" (UniqueName: \"kubernetes.io/projected/6ef73ee0-b136-4016-bcbd-28a1590b3305-kube-api-access-75xvq\") pod \"watchered8a-account-delete-4487d\" (UID: \"6ef73ee0-b136-4016-bcbd-28a1590b3305\") " pod="watcher-kuttl-default/watchered8a-account-delete-4487d" Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.085897 4956 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.086088 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="c68d3098-9fdb-434d-a7d7-796a245cba1c" containerName="watcher-decision-engine" containerID="cri-o://f06a5f6604d07e00fca82447a564124b08b35fc856be7e3d214c24bbef2eec4f" gracePeriod=30 Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.108273 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b2815683-6505-4893-a546-159ee65da05a","Type":"ContainerStarted","Data":"6da1eb1e2d0a14815be868d6b877588b2a9e7a6c59f7fed6b125567ed80b962c"} Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.108706 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.108715 4956 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="watcher-kuttl-default/watcher-kuttl-api-0" secret="" err="secret \"watcher-watcher-kuttl-dockercfg-tlsdz\" not found" Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.129698 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.129681576 podStartE2EDuration="2.129681576s" podCreationTimestamp="2026-03-14 09:36:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:36:53.125552641 +0000 UTC m=+2418.638244909" watchObservedRunningTime="2026-03-14 09:36:53.129681576 +0000 UTC m=+2418.642373844" Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.161602 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75xvq\" (UniqueName: \"kubernetes.io/projected/6ef73ee0-b136-4016-bcbd-28a1590b3305-kube-api-access-75xvq\") pod \"watchered8a-account-delete-4487d\" (UID: \"6ef73ee0-b136-4016-bcbd-28a1590b3305\") " pod="watcher-kuttl-default/watchered8a-account-delete-4487d" Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.162890 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ef73ee0-b136-4016-bcbd-28a1590b3305-operator-scripts\") pod \"watchered8a-account-delete-4487d\" (UID: \"6ef73ee0-b136-4016-bcbd-28a1590b3305\") " pod="watcher-kuttl-default/watchered8a-account-delete-4487d" Mar 14 09:36:53 crc kubenswrapper[4956]: E0314 09:36:53.162943 4956 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-api-config-data: secret "watcher-kuttl-api-config-data" not found Mar 14 09:36:53 crc kubenswrapper[4956]: E0314 09:36:53.163720 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-config-data podName:b2815683-6505-4893-a546-159ee65da05a nodeName:}" failed. 
No retries permitted until 2026-03-14 09:36:53.663705072 +0000 UTC m=+2419.176397350 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-config-data") pod "watcher-kuttl-api-0" (UID: "b2815683-6505-4893-a546-159ee65da05a") : secret "watcher-kuttl-api-config-data" not found Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.163585 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ef73ee0-b136-4016-bcbd-28a1590b3305-operator-scripts\") pod \"watchered8a-account-delete-4487d\" (UID: \"6ef73ee0-b136-4016-bcbd-28a1590b3305\") " pod="watcher-kuttl-default/watchered8a-account-delete-4487d" Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.186231 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75xvq\" (UniqueName: \"kubernetes.io/projected/6ef73ee0-b136-4016-bcbd-28a1590b3305-kube-api-access-75xvq\") pod \"watchered8a-account-delete-4487d\" (UID: \"6ef73ee0-b136-4016-bcbd-28a1590b3305\") " pod="watcher-kuttl-default/watchered8a-account-delete-4487d" Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.218932 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f53baaaa-3c9e-4e73-a8ca-8ab58635e119" path="/var/lib/kubelet/pods/f53baaaa-3c9e-4e73-a8ca-8ab58635e119/volumes" Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.276863 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watchered8a-account-delete-4487d" Mar 14 09:36:53 crc kubenswrapper[4956]: E0314 09:36:53.672728 4956 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-api-config-data: secret "watcher-kuttl-api-config-data" not found Mar 14 09:36:53 crc kubenswrapper[4956]: E0314 09:36:53.673106 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-config-data podName:b2815683-6505-4893-a546-159ee65da05a nodeName:}" failed. No retries permitted until 2026-03-14 09:36:54.673085065 +0000 UTC m=+2420.185777333 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-config-data") pod "watcher-kuttl-api-0" (UID: "b2815683-6505-4893-a546-159ee65da05a") : secret "watcher-kuttl-api-config-data" not found Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.741700 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchered8a-account-delete-4487d"] Mar 14 09:36:53 crc kubenswrapper[4956]: W0314 09:36:53.760994 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ef73ee0_b136_4016_bcbd_28a1590b3305.slice/crio-4bb229cc032a384a401d110cee26077ee471e3c5d4500315fc6369674be2ebfd WatchSource:0}: Error finding container 4bb229cc032a384a401d110cee26077ee471e3c5d4500315fc6369674be2ebfd: Status 404 returned error can't find the container with id 4bb229cc032a384a401d110cee26077ee471e3c5d4500315fc6369674be2ebfd Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.880346 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.980989 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-config-data\") pod \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.981335 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-combined-ca-bundle\") pod \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.981358 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-ceilometer-tls-certs\") pod \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.981405 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-sg-core-conf-yaml\") pod \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.981444 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-run-httpd\") pod \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.981463 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-log-httpd\") pod \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.981498 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cpb2\" (UniqueName: \"kubernetes.io/projected/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-kube-api-access-7cpb2\") pod \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.981573 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-scripts\") pod \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\" (UID: \"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942\") " Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.984882 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" (UID: "2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.985060 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" (UID: "2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:36:53 crc kubenswrapper[4956]: I0314 09:36:53.998470 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-kube-api-access-7cpb2" (OuterVolumeSpecName: "kube-api-access-7cpb2") pod "2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" (UID: "2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942"). InnerVolumeSpecName "kube-api-access-7cpb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.006771 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-scripts" (OuterVolumeSpecName: "scripts") pod "2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" (UID: "2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.047543 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" (UID: "2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.085949 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.085991 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.086003 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.086011 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.086019 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cpb2\" (UniqueName: \"kubernetes.io/projected/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-kube-api-access-7cpb2\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.152868 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" (UID: "2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.155000 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchered8a-account-delete-4487d" event={"ID":"6ef73ee0-b136-4016-bcbd-28a1590b3305","Type":"ContainerStarted","Data":"4a31ef57030e278350ebab884ff409db661aab9cd8d25e426395902db606ac73"} Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.155038 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchered8a-account-delete-4487d" event={"ID":"6ef73ee0-b136-4016-bcbd-28a1590b3305","Type":"ContainerStarted","Data":"4bb229cc032a384a401d110cee26077ee471e3c5d4500315fc6369674be2ebfd"} Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.161666 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" (UID: "2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.163325 4956 generic.go:334] "Generic (PLEG): container finished" podID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerID="ed14ab9facbdc478241ee958f5652840916454b099599ab3a7a51253587dde8f" exitCode=0 Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.163387 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942","Type":"ContainerDied","Data":"ed14ab9facbdc478241ee958f5652840916454b099599ab3a7a51253587dde8f"} Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.163416 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942","Type":"ContainerDied","Data":"573c7ba72179109b8b55513f683a8d1062cd6f53d8ca7a5865be18ec3763dedc"} Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.163437 4956 scope.go:117] "RemoveContainer" containerID="30a9e48555c291ab01a4f9ba94eb4f63d58eb74badcc73b272d6c6fda9fa5c8e" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.163653 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.175145 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watchered8a-account-delete-4487d" podStartSLOduration=2.17510101 podStartE2EDuration="2.17510101s" podCreationTimestamp="2026-03-14 09:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:36:54.171602571 +0000 UTC m=+2419.684294839" watchObservedRunningTime="2026-03-14 09:36:54.17510101 +0000 UTC m=+2419.687793298" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.176892 4956 generic.go:334] "Generic (PLEG): container finished" podID="973f66e6-bf42-4786-9330-65c6ca115a9c" containerID="71f24bdb3d02e8026ee0f519394df1c00564295e7cf5fc5120141adaff4dbe71" exitCode=0 Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.177006 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"973f66e6-bf42-4786-9330-65c6ca115a9c","Type":"ContainerDied","Data":"71f24bdb3d02e8026ee0f519394df1c00564295e7cf5fc5120141adaff4dbe71"} Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.177293 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="b2815683-6505-4893-a546-159ee65da05a" containerName="watcher-kuttl-api-log" containerID="cri-o://4b85fb3cc0ff0dfb62e9a42b6ca3ce97cca8f5644bb4bab4da643223e692bec4" gracePeriod=30 Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.177472 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="b2815683-6505-4893-a546-159ee65da05a" containerName="watcher-api" containerID="cri-o://6da1eb1e2d0a14815be868d6b877588b2a9e7a6c59f7fed6b125567ed80b962c" gracePeriod=30 Mar 14 09:36:54 crc kubenswrapper[4956]: 
I0314 09:36:54.183784 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="b2815683-6505-4893-a546-159ee65da05a" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.185:9322/\": EOF" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.188026 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.188053 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.199856 4956 scope.go:117] "RemoveContainer" containerID="4e47b0f89241b50715b810c0ef692b75b66274c21ea9afe7da0b4b8deda788ed" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.208100 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-config-data" (OuterVolumeSpecName: "config-data") pod "2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" (UID: "2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.210454 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:36:54 crc kubenswrapper[4956]: E0314 09:36:54.212224 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.221017 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.221350 4956 scope.go:117] "RemoveContainer" containerID="ed14ab9facbdc478241ee958f5652840916454b099599ab3a7a51253587dde8f" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.253320 4956 scope.go:117] "RemoveContainer" containerID="ac5b27f33bb8aa47ac86d90c13c94fe606674fb0740fe3fa27b0f16065c99865" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.277581 4956 scope.go:117] "RemoveContainer" containerID="30a9e48555c291ab01a4f9ba94eb4f63d58eb74badcc73b272d6c6fda9fa5c8e" Mar 14 09:36:54 crc kubenswrapper[4956]: E0314 09:36:54.278769 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30a9e48555c291ab01a4f9ba94eb4f63d58eb74badcc73b272d6c6fda9fa5c8e\": container with ID starting with 30a9e48555c291ab01a4f9ba94eb4f63d58eb74badcc73b272d6c6fda9fa5c8e not found: ID does not exist" containerID="30a9e48555c291ab01a4f9ba94eb4f63d58eb74badcc73b272d6c6fda9fa5c8e" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.278808 4956 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30a9e48555c291ab01a4f9ba94eb4f63d58eb74badcc73b272d6c6fda9fa5c8e"} err="failed to get container status \"30a9e48555c291ab01a4f9ba94eb4f63d58eb74badcc73b272d6c6fda9fa5c8e\": rpc error: code = NotFound desc = could not find container \"30a9e48555c291ab01a4f9ba94eb4f63d58eb74badcc73b272d6c6fda9fa5c8e\": container with ID starting with 30a9e48555c291ab01a4f9ba94eb4f63d58eb74badcc73b272d6c6fda9fa5c8e not found: ID does not exist" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.278827 4956 scope.go:117] "RemoveContainer" containerID="4e47b0f89241b50715b810c0ef692b75b66274c21ea9afe7da0b4b8deda788ed" Mar 14 09:36:54 crc kubenswrapper[4956]: E0314 09:36:54.279630 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e47b0f89241b50715b810c0ef692b75b66274c21ea9afe7da0b4b8deda788ed\": container with ID starting with 4e47b0f89241b50715b810c0ef692b75b66274c21ea9afe7da0b4b8deda788ed not found: ID does not exist" containerID="4e47b0f89241b50715b810c0ef692b75b66274c21ea9afe7da0b4b8deda788ed" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.279655 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e47b0f89241b50715b810c0ef692b75b66274c21ea9afe7da0b4b8deda788ed"} err="failed to get container status \"4e47b0f89241b50715b810c0ef692b75b66274c21ea9afe7da0b4b8deda788ed\": rpc error: code = NotFound desc = could not find container \"4e47b0f89241b50715b810c0ef692b75b66274c21ea9afe7da0b4b8deda788ed\": container with ID starting with 4e47b0f89241b50715b810c0ef692b75b66274c21ea9afe7da0b4b8deda788ed not found: ID does not exist" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.279669 4956 scope.go:117] "RemoveContainer" containerID="ed14ab9facbdc478241ee958f5652840916454b099599ab3a7a51253587dde8f" Mar 14 09:36:54 crc kubenswrapper[4956]: E0314 
09:36:54.280020 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed14ab9facbdc478241ee958f5652840916454b099599ab3a7a51253587dde8f\": container with ID starting with ed14ab9facbdc478241ee958f5652840916454b099599ab3a7a51253587dde8f not found: ID does not exist" containerID="ed14ab9facbdc478241ee958f5652840916454b099599ab3a7a51253587dde8f" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.280046 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed14ab9facbdc478241ee958f5652840916454b099599ab3a7a51253587dde8f"} err="failed to get container status \"ed14ab9facbdc478241ee958f5652840916454b099599ab3a7a51253587dde8f\": rpc error: code = NotFound desc = could not find container \"ed14ab9facbdc478241ee958f5652840916454b099599ab3a7a51253587dde8f\": container with ID starting with ed14ab9facbdc478241ee958f5652840916454b099599ab3a7a51253587dde8f not found: ID does not exist" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.280057 4956 scope.go:117] "RemoveContainer" containerID="ac5b27f33bb8aa47ac86d90c13c94fe606674fb0740fe3fa27b0f16065c99865" Mar 14 09:36:54 crc kubenswrapper[4956]: E0314 09:36:54.280426 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac5b27f33bb8aa47ac86d90c13c94fe606674fb0740fe3fa27b0f16065c99865\": container with ID starting with ac5b27f33bb8aa47ac86d90c13c94fe606674fb0740fe3fa27b0f16065c99865 not found: ID does not exist" containerID="ac5b27f33bb8aa47ac86d90c13c94fe606674fb0740fe3fa27b0f16065c99865" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.280451 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5b27f33bb8aa47ac86d90c13c94fe606674fb0740fe3fa27b0f16065c99865"} err="failed to get container status \"ac5b27f33bb8aa47ac86d90c13c94fe606674fb0740fe3fa27b0f16065c99865\": rpc 
error: code = NotFound desc = could not find container \"ac5b27f33bb8aa47ac86d90c13c94fe606674fb0740fe3fa27b0f16065c99865\": container with ID starting with ac5b27f33bb8aa47ac86d90c13c94fe606674fb0740fe3fa27b0f16065c99865 not found: ID does not exist" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.289490 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/973f66e6-bf42-4786-9330-65c6ca115a9c-logs\") pod \"973f66e6-bf42-4786-9330-65c6ca115a9c\" (UID: \"973f66e6-bf42-4786-9330-65c6ca115a9c\") " Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.289578 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973f66e6-bf42-4786-9330-65c6ca115a9c-config-data\") pod \"973f66e6-bf42-4786-9330-65c6ca115a9c\" (UID: \"973f66e6-bf42-4786-9330-65c6ca115a9c\") " Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.289622 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c29n\" (UniqueName: \"kubernetes.io/projected/973f66e6-bf42-4786-9330-65c6ca115a9c-kube-api-access-6c29n\") pod \"973f66e6-bf42-4786-9330-65c6ca115a9c\" (UID: \"973f66e6-bf42-4786-9330-65c6ca115a9c\") " Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.289680 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973f66e6-bf42-4786-9330-65c6ca115a9c-combined-ca-bundle\") pod \"973f66e6-bf42-4786-9330-65c6ca115a9c\" (UID: \"973f66e6-bf42-4786-9330-65c6ca115a9c\") " Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.289842 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/973f66e6-bf42-4786-9330-65c6ca115a9c-logs" (OuterVolumeSpecName: "logs") pod "973f66e6-bf42-4786-9330-65c6ca115a9c" (UID: "973f66e6-bf42-4786-9330-65c6ca115a9c"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.290097 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/973f66e6-bf42-4786-9330-65c6ca115a9c-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.290115 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.298057 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/973f66e6-bf42-4786-9330-65c6ca115a9c-kube-api-access-6c29n" (OuterVolumeSpecName: "kube-api-access-6c29n") pod "973f66e6-bf42-4786-9330-65c6ca115a9c" (UID: "973f66e6-bf42-4786-9330-65c6ca115a9c"). InnerVolumeSpecName "kube-api-access-6c29n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.322810 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973f66e6-bf42-4786-9330-65c6ca115a9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "973f66e6-bf42-4786-9330-65c6ca115a9c" (UID: "973f66e6-bf42-4786-9330-65c6ca115a9c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.341009 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973f66e6-bf42-4786-9330-65c6ca115a9c-config-data" (OuterVolumeSpecName: "config-data") pod "973f66e6-bf42-4786-9330-65c6ca115a9c" (UID: "973f66e6-bf42-4786-9330-65c6ca115a9c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.391143 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973f66e6-bf42-4786-9330-65c6ca115a9c-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.391910 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c29n\" (UniqueName: \"kubernetes.io/projected/973f66e6-bf42-4786-9330-65c6ca115a9c-kube-api-access-6c29n\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.391925 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973f66e6-bf42-4786-9330-65c6ca115a9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.509198 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.519471 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.530279 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:36:54 crc kubenswrapper[4956]: E0314 09:36:54.530692 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="ceilometer-notification-agent" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.530713 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="ceilometer-notification-agent" Mar 14 09:36:54 crc kubenswrapper[4956]: E0314 09:36:54.530728 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="ceilometer-central-agent" Mar 14 09:36:54 
crc kubenswrapper[4956]: I0314 09:36:54.530738 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="ceilometer-central-agent" Mar 14 09:36:54 crc kubenswrapper[4956]: E0314 09:36:54.530753 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="sg-core" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.530763 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="sg-core" Mar 14 09:36:54 crc kubenswrapper[4956]: E0314 09:36:54.530795 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="proxy-httpd" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.530803 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="proxy-httpd" Mar 14 09:36:54 crc kubenswrapper[4956]: E0314 09:36:54.530833 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973f66e6-bf42-4786-9330-65c6ca115a9c" containerName="watcher-applier" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.530842 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="973f66e6-bf42-4786-9330-65c6ca115a9c" containerName="watcher-applier" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.531024 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="sg-core" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.531049 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="ceilometer-notification-agent" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.531067 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="973f66e6-bf42-4786-9330-65c6ca115a9c" containerName="watcher-applier" Mar 14 09:36:54 crc 
kubenswrapper[4956]: I0314 09:36:54.531080 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="ceilometer-central-agent" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.531094 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" containerName="proxy-httpd" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.532979 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: E0314 09:36:54.535472 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ef73ee0_b136_4016_bcbd_28a1590b3305.slice/crio-conmon-4a31ef57030e278350ebab884ff409db661aab9cd8d25e426395902db606ac73.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ef73ee0_b136_4016_bcbd_28a1590b3305.slice/crio-4a31ef57030e278350ebab884ff409db661aab9cd8d25e426395902db606ac73.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e1bfd92_a9b1_4fe9_8dc3_ef4736c59942.slice\": RecentStats: unable to find data in memory cache]" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.535537 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.546454 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.548427 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 
09:36:54.566953 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.594938 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea7fdc2c-3663-40bd-92fb-f17f726de1df-run-httpd\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.594987 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.595006 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-config-data\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.595038 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-scripts\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.595078 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8wl2\" (UniqueName: \"kubernetes.io/projected/ea7fdc2c-3663-40bd-92fb-f17f726de1df-kube-api-access-p8wl2\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " 
pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.595117 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.595139 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.595248 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea7fdc2c-3663-40bd-92fb-f17f726de1df-log-httpd\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.696266 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea7fdc2c-3663-40bd-92fb-f17f726de1df-run-httpd\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.696314 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 
09:36:54.696335 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-config-data\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.696375 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-scripts\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.696417 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8wl2\" (UniqueName: \"kubernetes.io/projected/ea7fdc2c-3663-40bd-92fb-f17f726de1df-kube-api-access-p8wl2\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.696457 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.696492 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.696525 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ea7fdc2c-3663-40bd-92fb-f17f726de1df-log-httpd\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.696867 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea7fdc2c-3663-40bd-92fb-f17f726de1df-run-httpd\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.696884 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea7fdc2c-3663-40bd-92fb-f17f726de1df-log-httpd\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: E0314 09:36:54.697024 4956 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-api-config-data: secret "watcher-kuttl-api-config-data" not found Mar 14 09:36:54 crc kubenswrapper[4956]: E0314 09:36:54.697085 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-config-data podName:b2815683-6505-4893-a546-159ee65da05a nodeName:}" failed. No retries permitted until 2026-03-14 09:36:56.697065102 +0000 UTC m=+2422.209757370 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-config-data") pod "watcher-kuttl-api-0" (UID: "b2815683-6505-4893-a546-159ee65da05a") : secret "watcher-kuttl-api-config-data" not found Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.700669 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.700960 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.701217 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.702349 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-scripts\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.705474 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-config-data\") pod \"ceilometer-0\" (UID: 
\"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.716741 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8wl2\" (UniqueName: \"kubernetes.io/projected/ea7fdc2c-3663-40bd-92fb-f17f726de1df-kube-api-access-p8wl2\") pod \"ceilometer-0\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:54 crc kubenswrapper[4956]: I0314 09:36:54.872913 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:55 crc kubenswrapper[4956]: I0314 09:36:55.186749 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"973f66e6-bf42-4786-9330-65c6ca115a9c","Type":"ContainerDied","Data":"3971f5cdbedccafc4767af72e3501c4d87e5f6eb33dbdd09e1cb2df2af891f3b"} Mar 14 09:36:55 crc kubenswrapper[4956]: I0314 09:36:55.186801 4956 scope.go:117] "RemoveContainer" containerID="71f24bdb3d02e8026ee0f519394df1c00564295e7cf5fc5120141adaff4dbe71" Mar 14 09:36:55 crc kubenswrapper[4956]: I0314 09:36:55.186767 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:36:55 crc kubenswrapper[4956]: I0314 09:36:55.189588 4956 generic.go:334] "Generic (PLEG): container finished" podID="6ef73ee0-b136-4016-bcbd-28a1590b3305" containerID="4a31ef57030e278350ebab884ff409db661aab9cd8d25e426395902db606ac73" exitCode=0 Mar 14 09:36:55 crc kubenswrapper[4956]: I0314 09:36:55.189643 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchered8a-account-delete-4487d" event={"ID":"6ef73ee0-b136-4016-bcbd-28a1590b3305","Type":"ContainerDied","Data":"4a31ef57030e278350ebab884ff409db661aab9cd8d25e426395902db606ac73"} Mar 14 09:36:55 crc kubenswrapper[4956]: I0314 09:36:55.192992 4956 generic.go:334] "Generic (PLEG): container finished" podID="b2815683-6505-4893-a546-159ee65da05a" containerID="4b85fb3cc0ff0dfb62e9a42b6ca3ce97cca8f5644bb4bab4da643223e692bec4" exitCode=143 Mar 14 09:36:55 crc kubenswrapper[4956]: I0314 09:36:55.193046 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b2815683-6505-4893-a546-159ee65da05a","Type":"ContainerDied","Data":"4b85fb3cc0ff0dfb62e9a42b6ca3ce97cca8f5644bb4bab4da643223e692bec4"} Mar 14 09:36:55 crc kubenswrapper[4956]: I0314 09:36:55.220757 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942" path="/var/lib/kubelet/pods/2e1bfd92-a9b1-4fe9-8dc3-ef4736c59942/volumes" Mar 14 09:36:55 crc kubenswrapper[4956]: I0314 09:36:55.240280 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:36:55 crc kubenswrapper[4956]: I0314 09:36:55.247764 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:36:55 crc kubenswrapper[4956]: I0314 09:36:55.309619 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 
09:36:55 crc kubenswrapper[4956]: W0314 09:36:55.318516 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea7fdc2c_3663_40bd_92fb_f17f726de1df.slice/crio-611cff9d9ffc7973ef035cd47eb5fc757ae8f66427a427ce0fb2c0d9dc0d2712 WatchSource:0}: Error finding container 611cff9d9ffc7973ef035cd47eb5fc757ae8f66427a427ce0fb2c0d9dc0d2712: Status 404 returned error can't find the container with id 611cff9d9ffc7973ef035cd47eb5fc757ae8f66427a427ce0fb2c0d9dc0d2712 Mar 14 09:36:55 crc kubenswrapper[4956]: I0314 09:36:55.459838 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:36:55 crc kubenswrapper[4956]: I0314 09:36:55.571554 4956 scope.go:117] "RemoveContainer" containerID="3869937ff7b9ac2d31c83c6cb681ff0d14457d6147b97297ad97bec680ab6536" Mar 14 09:36:56 crc kubenswrapper[4956]: I0314 09:36:56.203748 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ea7fdc2c-3663-40bd-92fb-f17f726de1df","Type":"ContainerStarted","Data":"6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88"} Mar 14 09:36:56 crc kubenswrapper[4956]: I0314 09:36:56.204000 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ea7fdc2c-3663-40bd-92fb-f17f726de1df","Type":"ContainerStarted","Data":"611cff9d9ffc7973ef035cd47eb5fc757ae8f66427a427ce0fb2c0d9dc0d2712"} Mar 14 09:36:56 crc kubenswrapper[4956]: I0314 09:36:56.470593 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:56 crc kubenswrapper[4956]: I0314 09:36:56.549041 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watchered8a-account-delete-4487d" Mar 14 09:36:56 crc kubenswrapper[4956]: I0314 09:36:56.628736 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75xvq\" (UniqueName: \"kubernetes.io/projected/6ef73ee0-b136-4016-bcbd-28a1590b3305-kube-api-access-75xvq\") pod \"6ef73ee0-b136-4016-bcbd-28a1590b3305\" (UID: \"6ef73ee0-b136-4016-bcbd-28a1590b3305\") " Mar 14 09:36:56 crc kubenswrapper[4956]: I0314 09:36:56.628885 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ef73ee0-b136-4016-bcbd-28a1590b3305-operator-scripts\") pod \"6ef73ee0-b136-4016-bcbd-28a1590b3305\" (UID: \"6ef73ee0-b136-4016-bcbd-28a1590b3305\") " Mar 14 09:36:56 crc kubenswrapper[4956]: I0314 09:36:56.629577 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ef73ee0-b136-4016-bcbd-28a1590b3305-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6ef73ee0-b136-4016-bcbd-28a1590b3305" (UID: "6ef73ee0-b136-4016-bcbd-28a1590b3305"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:36:56 crc kubenswrapper[4956]: I0314 09:36:56.632069 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef73ee0-b136-4016-bcbd-28a1590b3305-kube-api-access-75xvq" (OuterVolumeSpecName: "kube-api-access-75xvq") pod "6ef73ee0-b136-4016-bcbd-28a1590b3305" (UID: "6ef73ee0-b136-4016-bcbd-28a1590b3305"). InnerVolumeSpecName "kube-api-access-75xvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:56 crc kubenswrapper[4956]: I0314 09:36:56.730754 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75xvq\" (UniqueName: \"kubernetes.io/projected/6ef73ee0-b136-4016-bcbd-28a1590b3305-kube-api-access-75xvq\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:56 crc kubenswrapper[4956]: I0314 09:36:56.730778 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ef73ee0-b136-4016-bcbd-28a1590b3305-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:56 crc kubenswrapper[4956]: E0314 09:36:56.730856 4956 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-api-config-data: secret "watcher-kuttl-api-config-data" not found Mar 14 09:36:56 crc kubenswrapper[4956]: E0314 09:36:56.730896 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-config-data podName:b2815683-6505-4893-a546-159ee65da05a nodeName:}" failed. No retries permitted until 2026-03-14 09:37:00.730880486 +0000 UTC m=+2426.243572754 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-config-data") pod "watcher-kuttl-api-0" (UID: "b2815683-6505-4893-a546-159ee65da05a") : secret "watcher-kuttl-api-config-data" not found Mar 14 09:36:56 crc kubenswrapper[4956]: I0314 09:36:56.924411 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="b2815683-6505-4893-a546-159ee65da05a" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.185:9322/\": read tcp 10.217.0.2:50716->10.217.0.185:9322: read: connection reset by peer" Mar 14 09:36:56 crc kubenswrapper[4956]: I0314 09:36:56.924909 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="b2815683-6505-4893-a546-159ee65da05a" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.185:9322/\": dial tcp 10.217.0.185:9322: connect: connection refused" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.226676 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="973f66e6-bf42-4786-9330-65c6ca115a9c" path="/var/lib/kubelet/pods/973f66e6-bf42-4786-9330-65c6ca115a9c/volumes" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.226827 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watchered8a-account-delete-4487d" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.231173 4956 generic.go:334] "Generic (PLEG): container finished" podID="b2815683-6505-4893-a546-159ee65da05a" containerID="6da1eb1e2d0a14815be868d6b877588b2a9e7a6c59f7fed6b125567ed80b962c" exitCode=0 Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.234080 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ea7fdc2c-3663-40bd-92fb-f17f726de1df","Type":"ContainerStarted","Data":"74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065"} Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.234112 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchered8a-account-delete-4487d" event={"ID":"6ef73ee0-b136-4016-bcbd-28a1590b3305","Type":"ContainerDied","Data":"4bb229cc032a384a401d110cee26077ee471e3c5d4500315fc6369674be2ebfd"} Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.234129 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bb229cc032a384a401d110cee26077ee471e3c5d4500315fc6369674be2ebfd" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.234138 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b2815683-6505-4893-a546-159ee65da05a","Type":"ContainerDied","Data":"6da1eb1e2d0a14815be868d6b877588b2a9e7a6c59f7fed6b125567ed80b962c"} Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.336418 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.441093 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-custom-prometheus-ca\") pod \"b2815683-6505-4893-a546-159ee65da05a\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.441199 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-internal-tls-certs\") pod \"b2815683-6505-4893-a546-159ee65da05a\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.441740 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-combined-ca-bundle\") pod \"b2815683-6505-4893-a546-159ee65da05a\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.441774 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4774\" (UniqueName: \"kubernetes.io/projected/b2815683-6505-4893-a546-159ee65da05a-kube-api-access-g4774\") pod \"b2815683-6505-4893-a546-159ee65da05a\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.441801 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-config-data\") pod \"b2815683-6505-4893-a546-159ee65da05a\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.441831 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2815683-6505-4893-a546-159ee65da05a-logs\") pod \"b2815683-6505-4893-a546-159ee65da05a\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.441900 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-public-tls-certs\") pod \"b2815683-6505-4893-a546-159ee65da05a\" (UID: \"b2815683-6505-4893-a546-159ee65da05a\") " Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.442237 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2815683-6505-4893-a546-159ee65da05a-logs" (OuterVolumeSpecName: "logs") pod "b2815683-6505-4893-a546-159ee65da05a" (UID: "b2815683-6505-4893-a546-159ee65da05a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.445651 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2815683-6505-4893-a546-159ee65da05a-kube-api-access-g4774" (OuterVolumeSpecName: "kube-api-access-g4774") pod "b2815683-6505-4893-a546-159ee65da05a" (UID: "b2815683-6505-4893-a546-159ee65da05a"). InnerVolumeSpecName "kube-api-access-g4774". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.469163 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "b2815683-6505-4893-a546-159ee65da05a" (UID: "b2815683-6505-4893-a546-159ee65da05a"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.471366 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2815683-6505-4893-a546-159ee65da05a" (UID: "b2815683-6505-4893-a546-159ee65da05a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.485061 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-config-data" (OuterVolumeSpecName: "config-data") pod "b2815683-6505-4893-a546-159ee65da05a" (UID: "b2815683-6505-4893-a546-159ee65da05a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.485880 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b2815683-6505-4893-a546-159ee65da05a" (UID: "b2815683-6505-4893-a546-159ee65da05a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.495099 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b2815683-6505-4893-a546-159ee65da05a" (UID: "b2815683-6505-4893-a546-159ee65da05a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.543629 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.543735 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4774\" (UniqueName: \"kubernetes.io/projected/b2815683-6505-4893-a546-159ee65da05a-kube-api-access-g4774\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.543794 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.543889 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2815683-6505-4893-a546-159ee65da05a-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.543943 4956 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.543994 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:57 crc kubenswrapper[4956]: I0314 09:36:57.544045 4956 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2815683-6505-4893-a546-159ee65da05a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:58 crc kubenswrapper[4956]: I0314 09:36:58.004041 
4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-zzrgx"] Mar 14 09:36:58 crc kubenswrapper[4956]: I0314 09:36:58.015380 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-zzrgx"] Mar 14 09:36:58 crc kubenswrapper[4956]: I0314 09:36:58.033532 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs"] Mar 14 09:36:58 crc kubenswrapper[4956]: I0314 09:36:58.042567 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-ed8a-account-create-update-rxbjs"] Mar 14 09:36:58 crc kubenswrapper[4956]: I0314 09:36:58.048930 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watchered8a-account-delete-4487d"] Mar 14 09:36:58 crc kubenswrapper[4956]: I0314 09:36:58.054226 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watchered8a-account-delete-4487d"] Mar 14 09:36:58 crc kubenswrapper[4956]: I0314 09:36:58.240983 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ea7fdc2c-3663-40bd-92fb-f17f726de1df","Type":"ContainerStarted","Data":"649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88"} Mar 14 09:36:58 crc kubenswrapper[4956]: I0314 09:36:58.242744 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b2815683-6505-4893-a546-159ee65da05a","Type":"ContainerDied","Data":"1c1921919c255f32f0bcb67f3ea3bb37d93e9a7fc900a9c3c688f95fbeade28c"} Mar 14 09:36:58 crc kubenswrapper[4956]: I0314 09:36:58.242806 4956 scope.go:117] "RemoveContainer" containerID="6da1eb1e2d0a14815be868d6b877588b2a9e7a6c59f7fed6b125567ed80b962c" Mar 14 09:36:58 crc kubenswrapper[4956]: I0314 09:36:58.242808 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:36:58 crc kubenswrapper[4956]: I0314 09:36:58.264517 4956 scope.go:117] "RemoveContainer" containerID="4b85fb3cc0ff0dfb62e9a42b6ca3ce97cca8f5644bb4bab4da643223e692bec4" Mar 14 09:36:58 crc kubenswrapper[4956]: I0314 09:36:58.277690 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:36:58 crc kubenswrapper[4956]: I0314 09:36:58.287272 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.230121 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21e7f711-f5e2-4517-bdb5-6c9a3eca6c82" path="/var/lib/kubelet/pods/21e7f711-f5e2-4517-bdb5-6c9a3eca6c82/volumes" Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.231620 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef73ee0-b136-4016-bcbd-28a1590b3305" path="/var/lib/kubelet/pods/6ef73ee0-b136-4016-bcbd-28a1590b3305/volumes" Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.232558 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afa07278-73b8-4673-98ed-0e617c7426f9" path="/var/lib/kubelet/pods/afa07278-73b8-4673-98ed-0e617c7426f9/volumes" Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.234756 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2815683-6505-4893-a546-159ee65da05a" path="/var/lib/kubelet/pods/b2815683-6505-4893-a546-159ee65da05a/volumes" Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.257247 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ea7fdc2c-3663-40bd-92fb-f17f726de1df","Type":"ContainerStarted","Data":"c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79"} Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.257324 4956 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.257338 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerName="ceilometer-central-agent" containerID="cri-o://6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88" gracePeriod=30 Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.257428 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerName="proxy-httpd" containerID="cri-o://c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79" gracePeriod=30 Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.257471 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerName="sg-core" containerID="cri-o://649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88" gracePeriod=30 Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.257568 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerName="ceilometer-notification-agent" containerID="cri-o://74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065" gracePeriod=30 Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.276843 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.867222488 podStartE2EDuration="5.276827143s" podCreationTimestamp="2026-03-14 09:36:54 +0000 UTC" firstStartedPulling="2026-03-14 09:36:55.320785933 +0000 UTC m=+2420.833478211" lastFinishedPulling="2026-03-14 09:36:58.730390598 +0000 UTC m=+2424.243082866" 
observedRunningTime="2026-03-14 09:36:59.272904673 +0000 UTC m=+2424.785596941" watchObservedRunningTime="2026-03-14 09:36:59.276827143 +0000 UTC m=+2424.789519411" Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.855187 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.879523 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-config-data\") pod \"c68d3098-9fdb-434d-a7d7-796a245cba1c\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.879614 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-combined-ca-bundle\") pod \"c68d3098-9fdb-434d-a7d7-796a245cba1c\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.879683 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcpwk\" (UniqueName: \"kubernetes.io/projected/c68d3098-9fdb-434d-a7d7-796a245cba1c-kube-api-access-rcpwk\") pod \"c68d3098-9fdb-434d-a7d7-796a245cba1c\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.879767 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-custom-prometheus-ca\") pod \"c68d3098-9fdb-434d-a7d7-796a245cba1c\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.879832 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c68d3098-9fdb-434d-a7d7-796a245cba1c-logs\") pod \"c68d3098-9fdb-434d-a7d7-796a245cba1c\" (UID: \"c68d3098-9fdb-434d-a7d7-796a245cba1c\") " Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.881052 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c68d3098-9fdb-434d-a7d7-796a245cba1c-logs" (OuterVolumeSpecName: "logs") pod "c68d3098-9fdb-434d-a7d7-796a245cba1c" (UID: "c68d3098-9fdb-434d-a7d7-796a245cba1c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.885728 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c68d3098-9fdb-434d-a7d7-796a245cba1c-kube-api-access-rcpwk" (OuterVolumeSpecName: "kube-api-access-rcpwk") pod "c68d3098-9fdb-434d-a7d7-796a245cba1c" (UID: "c68d3098-9fdb-434d-a7d7-796a245cba1c"). InnerVolumeSpecName "kube-api-access-rcpwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.905007 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "c68d3098-9fdb-434d-a7d7-796a245cba1c" (UID: "c68d3098-9fdb-434d-a7d7-796a245cba1c"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.918924 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c68d3098-9fdb-434d-a7d7-796a245cba1c" (UID: "c68d3098-9fdb-434d-a7d7-796a245cba1c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.928139 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-config-data" (OuterVolumeSpecName: "config-data") pod "c68d3098-9fdb-434d-a7d7-796a245cba1c" (UID: "c68d3098-9fdb-434d-a7d7-796a245cba1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.981817 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.981847 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c68d3098-9fdb-434d-a7d7-796a245cba1c-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.981857 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.981866 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68d3098-9fdb-434d-a7d7-796a245cba1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.981876 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcpwk\" (UniqueName: \"kubernetes.io/projected/c68d3098-9fdb-434d-a7d7-796a245cba1c-kube-api-access-rcpwk\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:59 crc kubenswrapper[4956]: I0314 09:36:59.993609 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.083081 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8wl2\" (UniqueName: \"kubernetes.io/projected/ea7fdc2c-3663-40bd-92fb-f17f726de1df-kube-api-access-p8wl2\") pod \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.083183 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea7fdc2c-3663-40bd-92fb-f17f726de1df-log-httpd\") pod \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.083228 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-config-data\") pod \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.083255 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea7fdc2c-3663-40bd-92fb-f17f726de1df-run-httpd\") pod \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.083336 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-combined-ca-bundle\") pod \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.083367 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-ceilometer-tls-certs\") pod \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.083383 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-sg-core-conf-yaml\") pod \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.083470 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-scripts\") pod \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\" (UID: \"ea7fdc2c-3663-40bd-92fb-f17f726de1df\") " Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.083788 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea7fdc2c-3663-40bd-92fb-f17f726de1df-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ea7fdc2c-3663-40bd-92fb-f17f726de1df" (UID: "ea7fdc2c-3663-40bd-92fb-f17f726de1df"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.083804 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea7fdc2c-3663-40bd-92fb-f17f726de1df-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ea7fdc2c-3663-40bd-92fb-f17f726de1df" (UID: "ea7fdc2c-3663-40bd-92fb-f17f726de1df"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.083943 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea7fdc2c-3663-40bd-92fb-f17f726de1df-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.083955 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea7fdc2c-3663-40bd-92fb-f17f726de1df-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.086190 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-scripts" (OuterVolumeSpecName: "scripts") pod "ea7fdc2c-3663-40bd-92fb-f17f726de1df" (UID: "ea7fdc2c-3663-40bd-92fb-f17f726de1df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.086207 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea7fdc2c-3663-40bd-92fb-f17f726de1df-kube-api-access-p8wl2" (OuterVolumeSpecName: "kube-api-access-p8wl2") pod "ea7fdc2c-3663-40bd-92fb-f17f726de1df" (UID: "ea7fdc2c-3663-40bd-92fb-f17f726de1df"). InnerVolumeSpecName "kube-api-access-p8wl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.110064 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ea7fdc2c-3663-40bd-92fb-f17f726de1df" (UID: "ea7fdc2c-3663-40bd-92fb-f17f726de1df"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.133747 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ea7fdc2c-3663-40bd-92fb-f17f726de1df" (UID: "ea7fdc2c-3663-40bd-92fb-f17f726de1df"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.138384 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea7fdc2c-3663-40bd-92fb-f17f726de1df" (UID: "ea7fdc2c-3663-40bd-92fb-f17f726de1df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.146912 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-config-data" (OuterVolumeSpecName: "config-data") pod "ea7fdc2c-3663-40bd-92fb-f17f726de1df" (UID: "ea7fdc2c-3663-40bd-92fb-f17f726de1df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.185152 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.185184 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.185198 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.185208 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.185217 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea7fdc2c-3663-40bd-92fb-f17f726de1df-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.185227 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8wl2\" (UniqueName: \"kubernetes.io/projected/ea7fdc2c-3663-40bd-92fb-f17f726de1df-kube-api-access-p8wl2\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.274174 4956 generic.go:334] "Generic (PLEG): container finished" podID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerID="c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79" exitCode=0 Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.274201 4956 
generic.go:334] "Generic (PLEG): container finished" podID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerID="649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88" exitCode=2 Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.274251 4956 generic.go:334] "Generic (PLEG): container finished" podID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerID="74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065" exitCode=0 Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.274260 4956 generic.go:334] "Generic (PLEG): container finished" podID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerID="6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88" exitCode=0 Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.274229 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.274233 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ea7fdc2c-3663-40bd-92fb-f17f726de1df","Type":"ContainerDied","Data":"c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79"} Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.274310 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ea7fdc2c-3663-40bd-92fb-f17f726de1df","Type":"ContainerDied","Data":"649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88"} Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.274321 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ea7fdc2c-3663-40bd-92fb-f17f726de1df","Type":"ContainerDied","Data":"74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065"} Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.274330 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"ea7fdc2c-3663-40bd-92fb-f17f726de1df","Type":"ContainerDied","Data":"6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88"} Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.274345 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ea7fdc2c-3663-40bd-92fb-f17f726de1df","Type":"ContainerDied","Data":"611cff9d9ffc7973ef035cd47eb5fc757ae8f66427a427ce0fb2c0d9dc0d2712"} Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.274356 4956 scope.go:117] "RemoveContainer" containerID="c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.276266 4956 generic.go:334] "Generic (PLEG): container finished" podID="c68d3098-9fdb-434d-a7d7-796a245cba1c" containerID="f06a5f6604d07e00fca82447a564124b08b35fc856be7e3d214c24bbef2eec4f" exitCode=0 Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.276288 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"c68d3098-9fdb-434d-a7d7-796a245cba1c","Type":"ContainerDied","Data":"f06a5f6604d07e00fca82447a564124b08b35fc856be7e3d214c24bbef2eec4f"} Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.276306 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"c68d3098-9fdb-434d-a7d7-796a245cba1c","Type":"ContainerDied","Data":"b3e08feea66ccc347f80e9f9f3a0f1e0e947e412e5727c2d6f9666536d6fd559"} Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.276343 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.292942 4956 scope.go:117] "RemoveContainer" containerID="649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.320732 4956 scope.go:117] "RemoveContainer" containerID="74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.322523 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.340458 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.350249 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.355785 4956 scope.go:117] "RemoveContainer" containerID="6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.359731 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.369713 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:37:00 crc kubenswrapper[4956]: E0314 09:37:00.370144 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerName="ceilometer-central-agent" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.370159 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerName="ceilometer-central-agent" Mar 14 09:37:00 crc kubenswrapper[4956]: E0314 09:37:00.370170 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerName="sg-core" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.370178 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerName="sg-core" Mar 14 09:37:00 crc kubenswrapper[4956]: E0314 09:37:00.370193 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c68d3098-9fdb-434d-a7d7-796a245cba1c" containerName="watcher-decision-engine" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.370201 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68d3098-9fdb-434d-a7d7-796a245cba1c" containerName="watcher-decision-engine" Mar 14 09:37:00 crc kubenswrapper[4956]: E0314 09:37:00.370226 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef73ee0-b136-4016-bcbd-28a1590b3305" containerName="mariadb-account-delete" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.370234 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef73ee0-b136-4016-bcbd-28a1590b3305" containerName="mariadb-account-delete" Mar 14 09:37:00 crc kubenswrapper[4956]: E0314 09:37:00.370249 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerName="proxy-httpd" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.370256 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerName="proxy-httpd" Mar 14 09:37:00 crc kubenswrapper[4956]: E0314 09:37:00.370276 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2815683-6505-4893-a546-159ee65da05a" containerName="watcher-kuttl-api-log" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.370285 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2815683-6505-4893-a546-159ee65da05a" containerName="watcher-kuttl-api-log" Mar 14 09:37:00 crc kubenswrapper[4956]: E0314 09:37:00.370299 4956 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b2815683-6505-4893-a546-159ee65da05a" containerName="watcher-api" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.370308 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2815683-6505-4893-a546-159ee65da05a" containerName="watcher-api" Mar 14 09:37:00 crc kubenswrapper[4956]: E0314 09:37:00.370323 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerName="ceilometer-notification-agent" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.370332 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerName="ceilometer-notification-agent" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.370542 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerName="ceilometer-notification-agent" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.370557 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerName="sg-core" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.370571 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerName="ceilometer-central-agent" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.370582 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2815683-6505-4893-a546-159ee65da05a" containerName="watcher-kuttl-api-log" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.370600 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="c68d3098-9fdb-434d-a7d7-796a245cba1c" containerName="watcher-decision-engine" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.370615 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef73ee0-b136-4016-bcbd-28a1590b3305" containerName="mariadb-account-delete" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 
09:37:00.370628 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" containerName="proxy-httpd" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.370640 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2815683-6505-4893-a546-159ee65da05a" containerName="watcher-api" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.375980 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.377812 4956 scope.go:117] "RemoveContainer" containerID="c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.381789 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:37:00 crc kubenswrapper[4956]: E0314 09:37:00.381962 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79\": container with ID starting with c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79 not found: ID does not exist" containerID="c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.382014 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79"} err="failed to get container status \"c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79\": rpc error: code = NotFound desc = could not find container \"c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79\": container with ID starting with c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79 not found: ID does not exist" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.382044 4956 
scope.go:117] "RemoveContainer" containerID="649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.382248 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:37:00 crc kubenswrapper[4956]: E0314 09:37:00.382435 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88\": container with ID starting with 649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88 not found: ID does not exist" containerID="649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.382460 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88"} err="failed to get container status \"649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88\": rpc error: code = NotFound desc = could not find container \"649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88\": container with ID starting with 649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88 not found: ID does not exist" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.382478 4956 scope.go:117] "RemoveContainer" containerID="74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.382567 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.382616 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:37:00 crc kubenswrapper[4956]: E0314 09:37:00.383767 4956 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065\": container with ID starting with 74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065 not found: ID does not exist" containerID="74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.383793 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065"} err="failed to get container status \"74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065\": rpc error: code = NotFound desc = could not find container \"74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065\": container with ID starting with 74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065 not found: ID does not exist" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.383816 4956 scope.go:117] "RemoveContainer" containerID="6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88" Mar 14 09:37:00 crc kubenswrapper[4956]: E0314 09:37:00.387741 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88\": container with ID starting with 6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88 not found: ID does not exist" containerID="6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.387787 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88"} err="failed to get container status \"6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88\": rpc error: code = NotFound desc = could not find container 
\"6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88\": container with ID starting with 6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88 not found: ID does not exist" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.387820 4956 scope.go:117] "RemoveContainer" containerID="c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.388266 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79"} err="failed to get container status \"c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79\": rpc error: code = NotFound desc = could not find container \"c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79\": container with ID starting with c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79 not found: ID does not exist" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.388299 4956 scope.go:117] "RemoveContainer" containerID="649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.391773 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88"} err="failed to get container status \"649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88\": rpc error: code = NotFound desc = could not find container \"649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88\": container with ID starting with 649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88 not found: ID does not exist" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.391807 4956 scope.go:117] "RemoveContainer" containerID="74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.393445 4956 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065"} err="failed to get container status \"74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065\": rpc error: code = NotFound desc = could not find container \"74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065\": container with ID starting with 74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065 not found: ID does not exist" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.393469 4956 scope.go:117] "RemoveContainer" containerID="6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.393797 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88"} err="failed to get container status \"6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88\": rpc error: code = NotFound desc = could not find container \"6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88\": container with ID starting with 6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88 not found: ID does not exist" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.393819 4956 scope.go:117] "RemoveContainer" containerID="c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.394234 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79"} err="failed to get container status \"c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79\": rpc error: code = NotFound desc = could not find container \"c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79\": container with ID starting with 
c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79 not found: ID does not exist" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.394261 4956 scope.go:117] "RemoveContainer" containerID="649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.394746 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88"} err="failed to get container status \"649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88\": rpc error: code = NotFound desc = could not find container \"649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88\": container with ID starting with 649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88 not found: ID does not exist" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.394772 4956 scope.go:117] "RemoveContainer" containerID="74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.398356 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065"} err="failed to get container status \"74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065\": rpc error: code = NotFound desc = could not find container \"74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065\": container with ID starting with 74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065 not found: ID does not exist" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.398395 4956 scope.go:117] "RemoveContainer" containerID="6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.398876 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88"} err="failed to get container status \"6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88\": rpc error: code = NotFound desc = could not find container \"6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88\": container with ID starting with 6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88 not found: ID does not exist" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.398920 4956 scope.go:117] "RemoveContainer" containerID="c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.399325 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79"} err="failed to get container status \"c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79\": rpc error: code = NotFound desc = could not find container \"c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79\": container with ID starting with c78eeeb24bac41a666a5cd5b2da7cf88294ee37fdda37ca5bf0cb1f11672ad79 not found: ID does not exist" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.399350 4956 scope.go:117] "RemoveContainer" containerID="649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.399620 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88"} err="failed to get container status \"649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88\": rpc error: code = NotFound desc = could not find container \"649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88\": container with ID starting with 649e7a82ba192187fbb829be7d2a1b270da360bdeb5bd31ec156d2450a846e88 not found: ID does not 
exist" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.399639 4956 scope.go:117] "RemoveContainer" containerID="74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.399873 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065"} err="failed to get container status \"74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065\": rpc error: code = NotFound desc = could not find container \"74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065\": container with ID starting with 74940e83997a532a6c60cae5dbfafd304b42a6826472ddc4b036a3b81a160065 not found: ID does not exist" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.399889 4956 scope.go:117] "RemoveContainer" containerID="6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.400123 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88"} err="failed to get container status \"6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88\": rpc error: code = NotFound desc = could not find container \"6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88\": container with ID starting with 6df3d16ba8e0cd71b261c256771d032451e7d52012f4a454e717c5a60a71fa88 not found: ID does not exist" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.400141 4956 scope.go:117] "RemoveContainer" containerID="f06a5f6604d07e00fca82447a564124b08b35fc856be7e3d214c24bbef2eec4f" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.420834 4956 scope.go:117] "RemoveContainer" containerID="f06a5f6604d07e00fca82447a564124b08b35fc856be7e3d214c24bbef2eec4f" Mar 14 09:37:00 crc kubenswrapper[4956]: E0314 09:37:00.421220 4956 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f06a5f6604d07e00fca82447a564124b08b35fc856be7e3d214c24bbef2eec4f\": container with ID starting with f06a5f6604d07e00fca82447a564124b08b35fc856be7e3d214c24bbef2eec4f not found: ID does not exist" containerID="f06a5f6604d07e00fca82447a564124b08b35fc856be7e3d214c24bbef2eec4f" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.421287 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f06a5f6604d07e00fca82447a564124b08b35fc856be7e3d214c24bbef2eec4f"} err="failed to get container status \"f06a5f6604d07e00fca82447a564124b08b35fc856be7e3d214c24bbef2eec4f\": rpc error: code = NotFound desc = could not find container \"f06a5f6604d07e00fca82447a564124b08b35fc856be7e3d214c24bbef2eec4f\": container with ID starting with f06a5f6604d07e00fca82447a564124b08b35fc856be7e3d214c24bbef2eec4f not found: ID does not exist" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.488627 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6925581-38d4-4d3e-b270-2924e26ec93f-run-httpd\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.488672 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-scripts\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.488702 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.488724 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdkwc\" (UniqueName: \"kubernetes.io/projected/f6925581-38d4-4d3e-b270-2924e26ec93f-kube-api-access-xdkwc\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.488744 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-config-data\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.488905 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6925581-38d4-4d3e-b270-2924e26ec93f-log-httpd\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.488970 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.489114 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.590545 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-scripts\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.590578 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6925581-38d4-4d3e-b270-2924e26ec93f-run-httpd\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.590611 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.590635 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdkwc\" (UniqueName: \"kubernetes.io/projected/f6925581-38d4-4d3e-b270-2924e26ec93f-kube-api-access-xdkwc\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.590653 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-config-data\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " 
pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.590683 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6925581-38d4-4d3e-b270-2924e26ec93f-log-httpd\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.590958 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.591002 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.591148 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6925581-38d4-4d3e-b270-2924e26ec93f-log-httpd\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.591364 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6925581-38d4-4d3e-b270-2924e26ec93f-run-httpd\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.594308 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.594547 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.595064 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.596018 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-scripts\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.602770 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-config-data\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.607974 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdkwc\" (UniqueName: \"kubernetes.io/projected/f6925581-38d4-4d3e-b270-2924e26ec93f-kube-api-access-xdkwc\") pod \"ceilometer-0\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " 
pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:00 crc kubenswrapper[4956]: I0314 09:37:00.702214 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.022026 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-s62jg"] Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.023715 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-s62jg" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.027803 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r"] Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.028861 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.033654 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r"] Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.034711 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.043213 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-s62jg"] Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.103098 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96054dc2-caf1-4272-9615-05b7611d9644-operator-scripts\") pod \"watcher-db-create-s62jg\" (UID: \"96054dc2-caf1-4272-9615-05b7611d9644\") " pod="watcher-kuttl-default/watcher-db-create-s62jg" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.103172 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58vrm\" (UniqueName: \"kubernetes.io/projected/c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5-kube-api-access-58vrm\") pod \"watcher-6c35-account-create-update-tbl6r\" (UID: \"c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5\") " pod="watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.103239 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thf6t\" (UniqueName: \"kubernetes.io/projected/96054dc2-caf1-4272-9615-05b7611d9644-kube-api-access-thf6t\") pod \"watcher-db-create-s62jg\" (UID: \"96054dc2-caf1-4272-9615-05b7611d9644\") " pod="watcher-kuttl-default/watcher-db-create-s62jg" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.103266 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5-operator-scripts\") pod \"watcher-6c35-account-create-update-tbl6r\" (UID: \"c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5\") " pod="watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.158131 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.205390 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96054dc2-caf1-4272-9615-05b7611d9644-operator-scripts\") pod \"watcher-db-create-s62jg\" (UID: \"96054dc2-caf1-4272-9615-05b7611d9644\") " pod="watcher-kuttl-default/watcher-db-create-s62jg" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.205496 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58vrm\" (UniqueName: 
\"kubernetes.io/projected/c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5-kube-api-access-58vrm\") pod \"watcher-6c35-account-create-update-tbl6r\" (UID: \"c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5\") " pod="watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.205743 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thf6t\" (UniqueName: \"kubernetes.io/projected/96054dc2-caf1-4272-9615-05b7611d9644-kube-api-access-thf6t\") pod \"watcher-db-create-s62jg\" (UID: \"96054dc2-caf1-4272-9615-05b7611d9644\") " pod="watcher-kuttl-default/watcher-db-create-s62jg" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.205781 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5-operator-scripts\") pod \"watcher-6c35-account-create-update-tbl6r\" (UID: \"c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5\") " pod="watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.206300 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96054dc2-caf1-4272-9615-05b7611d9644-operator-scripts\") pod \"watcher-db-create-s62jg\" (UID: \"96054dc2-caf1-4272-9615-05b7611d9644\") " pod="watcher-kuttl-default/watcher-db-create-s62jg" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.207047 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5-operator-scripts\") pod \"watcher-6c35-account-create-update-tbl6r\" (UID: \"c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5\") " pod="watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.218075 4956 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="c68d3098-9fdb-434d-a7d7-796a245cba1c" path="/var/lib/kubelet/pods/c68d3098-9fdb-434d-a7d7-796a245cba1c/volumes" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.219067 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea7fdc2c-3663-40bd-92fb-f17f726de1df" path="/var/lib/kubelet/pods/ea7fdc2c-3663-40bd-92fb-f17f726de1df/volumes" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.225098 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58vrm\" (UniqueName: \"kubernetes.io/projected/c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5-kube-api-access-58vrm\") pod \"watcher-6c35-account-create-update-tbl6r\" (UID: \"c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5\") " pod="watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.225788 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thf6t\" (UniqueName: \"kubernetes.io/projected/96054dc2-caf1-4272-9615-05b7611d9644-kube-api-access-thf6t\") pod \"watcher-db-create-s62jg\" (UID: \"96054dc2-caf1-4272-9615-05b7611d9644\") " pod="watcher-kuttl-default/watcher-db-create-s62jg" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.284367 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f6925581-38d4-4d3e-b270-2924e26ec93f","Type":"ContainerStarted","Data":"a99ada92801949f65a4b7b3a4694c5d975e2a9cf24348332944da323a87992c0"} Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.355866 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-s62jg" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.364122 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r" Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.806580 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-s62jg"] Mar 14 09:37:01 crc kubenswrapper[4956]: I0314 09:37:01.886103 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r"] Mar 14 09:37:01 crc kubenswrapper[4956]: W0314 09:37:01.892522 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc28fba1d_4efd_4cf3_8f90_6a2df7ebbef5.slice/crio-d685a9479264cf5b6743dc7cf2b2fc53e5f68f021b035b48dd64e2cc8f2a6472 WatchSource:0}: Error finding container d685a9479264cf5b6743dc7cf2b2fc53e5f68f021b035b48dd64e2cc8f2a6472: Status 404 returned error can't find the container with id d685a9479264cf5b6743dc7cf2b2fc53e5f68f021b035b48dd64e2cc8f2a6472 Mar 14 09:37:02 crc kubenswrapper[4956]: I0314 09:37:02.307252 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-s62jg" event={"ID":"96054dc2-caf1-4272-9615-05b7611d9644","Type":"ContainerStarted","Data":"6f4983b6892f340840caef852dc183bea3837594e5c71f519e087cd83fe3da32"} Mar 14 09:37:02 crc kubenswrapper[4956]: I0314 09:37:02.307298 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-s62jg" event={"ID":"96054dc2-caf1-4272-9615-05b7611d9644","Type":"ContainerStarted","Data":"833e7e48a0e94ddc907e915dc09ba405c7c113ae9f5e066e958433f13b2c4c58"} Mar 14 09:37:02 crc kubenswrapper[4956]: I0314 09:37:02.314768 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f6925581-38d4-4d3e-b270-2924e26ec93f","Type":"ContainerStarted","Data":"af836f6aeaddec36f487bc6f4e4ebfd78ba616c5b8512fa29140c4f0643265d0"} Mar 14 09:37:02 crc kubenswrapper[4956]: I0314 
09:37:02.316173 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r" event={"ID":"c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5","Type":"ContainerStarted","Data":"04bb138cd4fd52e8e318184b642c4ab3a690080f8780675cd6f1ee1361022b4e"} Mar 14 09:37:02 crc kubenswrapper[4956]: I0314 09:37:02.316214 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r" event={"ID":"c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5","Type":"ContainerStarted","Data":"d685a9479264cf5b6743dc7cf2b2fc53e5f68f021b035b48dd64e2cc8f2a6472"} Mar 14 09:37:02 crc kubenswrapper[4956]: I0314 09:37:02.329773 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-db-create-s62jg" podStartSLOduration=2.3297570309999998 podStartE2EDuration="2.329757031s" podCreationTimestamp="2026-03-14 09:37:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:37:02.320393863 +0000 UTC m=+2427.833086151" watchObservedRunningTime="2026-03-14 09:37:02.329757031 +0000 UTC m=+2427.842449299" Mar 14 09:37:02 crc kubenswrapper[4956]: I0314 09:37:02.351473 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r" podStartSLOduration=2.351454803 podStartE2EDuration="2.351454803s" podCreationTimestamp="2026-03-14 09:37:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:37:02.337396686 +0000 UTC m=+2427.850088964" watchObservedRunningTime="2026-03-14 09:37:02.351454803 +0000 UTC m=+2427.864147071" Mar 14 09:37:03 crc kubenswrapper[4956]: I0314 09:37:03.335618 4956 generic.go:334] "Generic (PLEG): container finished" podID="c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5" 
containerID="04bb138cd4fd52e8e318184b642c4ab3a690080f8780675cd6f1ee1361022b4e" exitCode=0 Mar 14 09:37:03 crc kubenswrapper[4956]: I0314 09:37:03.335932 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r" event={"ID":"c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5","Type":"ContainerDied","Data":"04bb138cd4fd52e8e318184b642c4ab3a690080f8780675cd6f1ee1361022b4e"} Mar 14 09:37:03 crc kubenswrapper[4956]: I0314 09:37:03.337613 4956 generic.go:334] "Generic (PLEG): container finished" podID="96054dc2-caf1-4272-9615-05b7611d9644" containerID="6f4983b6892f340840caef852dc183bea3837594e5c71f519e087cd83fe3da32" exitCode=0 Mar 14 09:37:03 crc kubenswrapper[4956]: I0314 09:37:03.337669 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-s62jg" event={"ID":"96054dc2-caf1-4272-9615-05b7611d9644","Type":"ContainerDied","Data":"6f4983b6892f340840caef852dc183bea3837594e5c71f519e087cd83fe3da32"} Mar 14 09:37:03 crc kubenswrapper[4956]: I0314 09:37:03.339764 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f6925581-38d4-4d3e-b270-2924e26ec93f","Type":"ContainerStarted","Data":"f234e308a7aed0b1ddb81aab4ed35334273293dfc16fe37bdf64d19fcadbd349"} Mar 14 09:37:03 crc kubenswrapper[4956]: I0314 09:37:03.339813 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f6925581-38d4-4d3e-b270-2924e26ec93f","Type":"ContainerStarted","Data":"4b83a6b77a8294344c487c091428982b0a2d4add0df4fefce98d52390e3c0f36"} Mar 14 09:37:04 crc kubenswrapper[4956]: I0314 09:37:04.774924 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r" Mar 14 09:37:04 crc kubenswrapper[4956]: I0314 09:37:04.780229 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-s62jg" Mar 14 09:37:04 crc kubenswrapper[4956]: I0314 09:37:04.872252 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thf6t\" (UniqueName: \"kubernetes.io/projected/96054dc2-caf1-4272-9615-05b7611d9644-kube-api-access-thf6t\") pod \"96054dc2-caf1-4272-9615-05b7611d9644\" (UID: \"96054dc2-caf1-4272-9615-05b7611d9644\") " Mar 14 09:37:04 crc kubenswrapper[4956]: I0314 09:37:04.872416 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96054dc2-caf1-4272-9615-05b7611d9644-operator-scripts\") pod \"96054dc2-caf1-4272-9615-05b7611d9644\" (UID: \"96054dc2-caf1-4272-9615-05b7611d9644\") " Mar 14 09:37:04 crc kubenswrapper[4956]: I0314 09:37:04.872516 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5-operator-scripts\") pod \"c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5\" (UID: \"c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5\") " Mar 14 09:37:04 crc kubenswrapper[4956]: I0314 09:37:04.872658 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58vrm\" (UniqueName: \"kubernetes.io/projected/c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5-kube-api-access-58vrm\") pod \"c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5\" (UID: \"c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5\") " Mar 14 09:37:04 crc kubenswrapper[4956]: I0314 09:37:04.873066 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96054dc2-caf1-4272-9615-05b7611d9644-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "96054dc2-caf1-4272-9615-05b7611d9644" (UID: "96054dc2-caf1-4272-9615-05b7611d9644"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:37:04 crc kubenswrapper[4956]: I0314 09:37:04.873137 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5" (UID: "c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:37:04 crc kubenswrapper[4956]: I0314 09:37:04.873607 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:04 crc kubenswrapper[4956]: I0314 09:37:04.873634 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96054dc2-caf1-4272-9615-05b7611d9644-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:04 crc kubenswrapper[4956]: I0314 09:37:04.876535 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5-kube-api-access-58vrm" (OuterVolumeSpecName: "kube-api-access-58vrm") pod "c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5" (UID: "c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5"). InnerVolumeSpecName "kube-api-access-58vrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:37:04 crc kubenswrapper[4956]: I0314 09:37:04.876642 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96054dc2-caf1-4272-9615-05b7611d9644-kube-api-access-thf6t" (OuterVolumeSpecName: "kube-api-access-thf6t") pod "96054dc2-caf1-4272-9615-05b7611d9644" (UID: "96054dc2-caf1-4272-9615-05b7611d9644"). InnerVolumeSpecName "kube-api-access-thf6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:37:04 crc kubenswrapper[4956]: I0314 09:37:04.975160 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58vrm\" (UniqueName: \"kubernetes.io/projected/c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5-kube-api-access-58vrm\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:04 crc kubenswrapper[4956]: I0314 09:37:04.975209 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thf6t\" (UniqueName: \"kubernetes.io/projected/96054dc2-caf1-4272-9615-05b7611d9644-kube-api-access-thf6t\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:05 crc kubenswrapper[4956]: I0314 09:37:05.225107 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:37:05 crc kubenswrapper[4956]: E0314 09:37:05.225807 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:37:05 crc kubenswrapper[4956]: I0314 09:37:05.361623 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r" event={"ID":"c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5","Type":"ContainerDied","Data":"d685a9479264cf5b6743dc7cf2b2fc53e5f68f021b035b48dd64e2cc8f2a6472"} Mar 14 09:37:05 crc kubenswrapper[4956]: I0314 09:37:05.361671 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d685a9479264cf5b6743dc7cf2b2fc53e5f68f021b035b48dd64e2cc8f2a6472" Mar 14 09:37:05 crc kubenswrapper[4956]: I0314 09:37:05.361687 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r" Mar 14 09:37:05 crc kubenswrapper[4956]: I0314 09:37:05.364026 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-s62jg" Mar 14 09:37:05 crc kubenswrapper[4956]: I0314 09:37:05.364065 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-s62jg" event={"ID":"96054dc2-caf1-4272-9615-05b7611d9644","Type":"ContainerDied","Data":"833e7e48a0e94ddc907e915dc09ba405c7c113ae9f5e066e958433f13b2c4c58"} Mar 14 09:37:05 crc kubenswrapper[4956]: I0314 09:37:05.364131 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="833e7e48a0e94ddc907e915dc09ba405c7c113ae9f5e066e958433f13b2c4c58" Mar 14 09:37:05 crc kubenswrapper[4956]: I0314 09:37:05.367441 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f6925581-38d4-4d3e-b270-2924e26ec93f","Type":"ContainerStarted","Data":"f580164ad60f3de6bf2743a9d64d8bfc01df0c0efb82726d89811de48f87ce90"} Mar 14 09:37:05 crc kubenswrapper[4956]: I0314 09:37:05.369043 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:05 crc kubenswrapper[4956]: I0314 09:37:05.422266 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.028205709 podStartE2EDuration="5.422229297s" podCreationTimestamp="2026-03-14 09:37:00 +0000 UTC" firstStartedPulling="2026-03-14 09:37:01.169156738 +0000 UTC m=+2426.681849006" lastFinishedPulling="2026-03-14 09:37:04.563180326 +0000 UTC m=+2430.075872594" observedRunningTime="2026-03-14 09:37:05.403243433 +0000 UTC m=+2430.915935711" watchObservedRunningTime="2026-03-14 09:37:05.422229297 +0000 UTC m=+2430.934921575" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.260234 
4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-46924"] Mar 14 09:37:06 crc kubenswrapper[4956]: E0314 09:37:06.260573 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96054dc2-caf1-4272-9615-05b7611d9644" containerName="mariadb-database-create" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.260585 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="96054dc2-caf1-4272-9615-05b7611d9644" containerName="mariadb-database-create" Mar 14 09:37:06 crc kubenswrapper[4956]: E0314 09:37:06.260616 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5" containerName="mariadb-account-create-update" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.260621 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5" containerName="mariadb-account-create-update" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.260753 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="96054dc2-caf1-4272-9615-05b7611d9644" containerName="mariadb-database-create" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.260768 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5" containerName="mariadb-account-create-update" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.261317 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.263279 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.263682 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-6xvhr" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.272267 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-46924"] Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.295402 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsxbq\" (UniqueName: \"kubernetes.io/projected/75517d04-8feb-4d02-8b05-1f6fed48f03d-kube-api-access-fsxbq\") pod \"watcher-kuttl-db-sync-46924\" (UID: \"75517d04-8feb-4d02-8b05-1f6fed48f03d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.295612 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-46924\" (UID: \"75517d04-8feb-4d02-8b05-1f6fed48f03d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.295673 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-config-data\") pod \"watcher-kuttl-db-sync-46924\" (UID: \"75517d04-8feb-4d02-8b05-1f6fed48f03d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.295721 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-db-sync-config-data\") pod \"watcher-kuttl-db-sync-46924\" (UID: \"75517d04-8feb-4d02-8b05-1f6fed48f03d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.399251 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-config-data\") pod \"watcher-kuttl-db-sync-46924\" (UID: \"75517d04-8feb-4d02-8b05-1f6fed48f03d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.399381 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-db-sync-config-data\") pod \"watcher-kuttl-db-sync-46924\" (UID: \"75517d04-8feb-4d02-8b05-1f6fed48f03d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.399735 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsxbq\" (UniqueName: \"kubernetes.io/projected/75517d04-8feb-4d02-8b05-1f6fed48f03d-kube-api-access-fsxbq\") pod \"watcher-kuttl-db-sync-46924\" (UID: \"75517d04-8feb-4d02-8b05-1f6fed48f03d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.402790 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-46924\" (UID: \"75517d04-8feb-4d02-8b05-1f6fed48f03d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 
09:37:06.428396 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-46924\" (UID: \"75517d04-8feb-4d02-8b05-1f6fed48f03d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.428638 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-config-data\") pod \"watcher-kuttl-db-sync-46924\" (UID: \"75517d04-8feb-4d02-8b05-1f6fed48f03d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.432317 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsxbq\" (UniqueName: \"kubernetes.io/projected/75517d04-8feb-4d02-8b05-1f6fed48f03d-kube-api-access-fsxbq\") pod \"watcher-kuttl-db-sync-46924\" (UID: \"75517d04-8feb-4d02-8b05-1f6fed48f03d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.434950 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-db-sync-config-data\") pod \"watcher-kuttl-db-sync-46924\" (UID: \"75517d04-8feb-4d02-8b05-1f6fed48f03d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" Mar 14 09:37:06 crc kubenswrapper[4956]: I0314 09:37:06.583221 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" Mar 14 09:37:07 crc kubenswrapper[4956]: I0314 09:37:07.434923 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-46924"] Mar 14 09:37:08 crc kubenswrapper[4956]: I0314 09:37:08.393553 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" event={"ID":"75517d04-8feb-4d02-8b05-1f6fed48f03d","Type":"ContainerStarted","Data":"11b83dcd3b5441b2e4b7e4572e10a3322a09d12bdc5edb40a889ee5ba430e389"} Mar 14 09:37:08 crc kubenswrapper[4956]: I0314 09:37:08.393903 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" event={"ID":"75517d04-8feb-4d02-8b05-1f6fed48f03d","Type":"ContainerStarted","Data":"af7fbe16784ae61faaf5466dfbe471bb54dc01a104e44dab1501f3da8e519644"} Mar 14 09:37:08 crc kubenswrapper[4956]: I0314 09:37:08.411059 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" podStartSLOduration=2.411039883 podStartE2EDuration="2.411039883s" podCreationTimestamp="2026-03-14 09:37:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:37:08.406326133 +0000 UTC m=+2433.919018401" watchObservedRunningTime="2026-03-14 09:37:08.411039883 +0000 UTC m=+2433.923732151" Mar 14 09:37:10 crc kubenswrapper[4956]: I0314 09:37:10.415562 4956 generic.go:334] "Generic (PLEG): container finished" podID="75517d04-8feb-4d02-8b05-1f6fed48f03d" containerID="11b83dcd3b5441b2e4b7e4572e10a3322a09d12bdc5edb40a889ee5ba430e389" exitCode=0 Mar 14 09:37:10 crc kubenswrapper[4956]: I0314 09:37:10.415652 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" 
event={"ID":"75517d04-8feb-4d02-8b05-1f6fed48f03d","Type":"ContainerDied","Data":"11b83dcd3b5441b2e4b7e4572e10a3322a09d12bdc5edb40a889ee5ba430e389"} Mar 14 09:37:11 crc kubenswrapper[4956]: I0314 09:37:11.811151 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" Mar 14 09:37:11 crc kubenswrapper[4956]: I0314 09:37:11.917508 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-config-data\") pod \"75517d04-8feb-4d02-8b05-1f6fed48f03d\" (UID: \"75517d04-8feb-4d02-8b05-1f6fed48f03d\") " Mar 14 09:37:11 crc kubenswrapper[4956]: I0314 09:37:11.917617 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsxbq\" (UniqueName: \"kubernetes.io/projected/75517d04-8feb-4d02-8b05-1f6fed48f03d-kube-api-access-fsxbq\") pod \"75517d04-8feb-4d02-8b05-1f6fed48f03d\" (UID: \"75517d04-8feb-4d02-8b05-1f6fed48f03d\") " Mar 14 09:37:11 crc kubenswrapper[4956]: I0314 09:37:11.917754 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-combined-ca-bundle\") pod \"75517d04-8feb-4d02-8b05-1f6fed48f03d\" (UID: \"75517d04-8feb-4d02-8b05-1f6fed48f03d\") " Mar 14 09:37:11 crc kubenswrapper[4956]: I0314 09:37:11.917828 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-db-sync-config-data\") pod \"75517d04-8feb-4d02-8b05-1f6fed48f03d\" (UID: \"75517d04-8feb-4d02-8b05-1f6fed48f03d\") " Mar 14 09:37:11 crc kubenswrapper[4956]: I0314 09:37:11.924989 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/75517d04-8feb-4d02-8b05-1f6fed48f03d-kube-api-access-fsxbq" (OuterVolumeSpecName: "kube-api-access-fsxbq") pod "75517d04-8feb-4d02-8b05-1f6fed48f03d" (UID: "75517d04-8feb-4d02-8b05-1f6fed48f03d"). InnerVolumeSpecName "kube-api-access-fsxbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:37:11 crc kubenswrapper[4956]: I0314 09:37:11.928230 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "75517d04-8feb-4d02-8b05-1f6fed48f03d" (UID: "75517d04-8feb-4d02-8b05-1f6fed48f03d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:11 crc kubenswrapper[4956]: I0314 09:37:11.949519 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75517d04-8feb-4d02-8b05-1f6fed48f03d" (UID: "75517d04-8feb-4d02-8b05-1f6fed48f03d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:11 crc kubenswrapper[4956]: I0314 09:37:11.959819 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-config-data" (OuterVolumeSpecName: "config-data") pod "75517d04-8feb-4d02-8b05-1f6fed48f03d" (UID: "75517d04-8feb-4d02-8b05-1f6fed48f03d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.021067 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.021112 4956 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.021126 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75517d04-8feb-4d02-8b05-1f6fed48f03d-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.021138 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsxbq\" (UniqueName: \"kubernetes.io/projected/75517d04-8feb-4d02-8b05-1f6fed48f03d-kube-api-access-fsxbq\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.433689 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" event={"ID":"75517d04-8feb-4d02-8b05-1f6fed48f03d","Type":"ContainerDied","Data":"af7fbe16784ae61faaf5466dfbe471bb54dc01a104e44dab1501f3da8e519644"} Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.433723 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af7fbe16784ae61faaf5466dfbe471bb54dc01a104e44dab1501f3da8e519644" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.433776 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-46924" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.681608 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:37:12 crc kubenswrapper[4956]: E0314 09:37:12.682096 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75517d04-8feb-4d02-8b05-1f6fed48f03d" containerName="watcher-kuttl-db-sync" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.682126 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="75517d04-8feb-4d02-8b05-1f6fed48f03d" containerName="watcher-kuttl-db-sync" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.682313 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="75517d04-8feb-4d02-8b05-1f6fed48f03d" containerName="watcher-kuttl-db-sync" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.683156 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.686304 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-6xvhr" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.686326 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.705058 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.733863 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b48mf\" (UniqueName: \"kubernetes.io/projected/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-kube-api-access-b48mf\") pod \"watcher-kuttl-decision-engine-0\" (UID: 
\"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.733910 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.733965 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.733996 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.734038 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.835323 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.835388 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.835437 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.835499 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b48mf\" (UniqueName: \"kubernetes.io/projected/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-kube-api-access-b48mf\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.835527 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.836040 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.842734 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.843980 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.855579 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.857136 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b48mf\" (UniqueName: \"kubernetes.io/projected/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-kube-api-access-b48mf\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.891815 4956 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.894924 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.903525 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.926170 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.936660 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6c367c-6fde-474a-a698-67aacdc8c650-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"6d6c367c-6fde-474a-a698-67aacdc8c650\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.936767 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d6c367c-6fde-474a-a698-67aacdc8c650-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"6d6c367c-6fde-474a-a698-67aacdc8c650\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.936819 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d6c367c-6fde-474a-a698-67aacdc8c650-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"6d6c367c-6fde-474a-a698-67aacdc8c650\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.938277 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fjlx5\" (UniqueName: \"kubernetes.io/projected/6d6c367c-6fde-474a-a698-67aacdc8c650-kube-api-access-fjlx5\") pod \"watcher-kuttl-applier-0\" (UID: \"6d6c367c-6fde-474a-a698-67aacdc8c650\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.950019 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.952136 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.967041 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.967291 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Mar 14 09:37:12 crc kubenswrapper[4956]: I0314 09:37:12.967526 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:12.999946 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.011917 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.040394 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztbsz\" (UniqueName: \"kubernetes.io/projected/b758b526-d58f-4990-b10a-4f0672f5e63f-kube-api-access-ztbsz\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.040657 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d6c367c-6fde-474a-a698-67aacdc8c650-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"6d6c367c-6fde-474a-a698-67aacdc8c650\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.040756 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.040828 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.040920 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-config-data\") pod \"watcher-kuttl-api-0\" (UID: 
\"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.040987 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d6c367c-6fde-474a-a698-67aacdc8c650-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"6d6c367c-6fde-474a-a698-67aacdc8c650\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.041055 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjlx5\" (UniqueName: \"kubernetes.io/projected/6d6c367c-6fde-474a-a698-67aacdc8c650-kube-api-access-fjlx5\") pod \"watcher-kuttl-applier-0\" (UID: \"6d6c367c-6fde-474a-a698-67aacdc8c650\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.041120 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.041201 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b758b526-d58f-4990-b10a-4f0672f5e63f-logs\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.041305 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: 
\"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.041358 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d6c367c-6fde-474a-a698-67aacdc8c650-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"6d6c367c-6fde-474a-a698-67aacdc8c650\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.041394 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6c367c-6fde-474a-a698-67aacdc8c650-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"6d6c367c-6fde-474a-a698-67aacdc8c650\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.055232 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d6c367c-6fde-474a-a698-67aacdc8c650-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"6d6c367c-6fde-474a-a698-67aacdc8c650\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.055672 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6c367c-6fde-474a-a698-67aacdc8c650-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"6d6c367c-6fde-474a-a698-67aacdc8c650\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.070110 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjlx5\" (UniqueName: \"kubernetes.io/projected/6d6c367c-6fde-474a-a698-67aacdc8c650-kube-api-access-fjlx5\") pod \"watcher-kuttl-applier-0\" (UID: \"6d6c367c-6fde-474a-a698-67aacdc8c650\") " 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.144958 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b758b526-d58f-4990-b10a-4f0672f5e63f-logs\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.145042 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.145123 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztbsz\" (UniqueName: \"kubernetes.io/projected/b758b526-d58f-4990-b10a-4f0672f5e63f-kube-api-access-ztbsz\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.145184 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.145212 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc 
kubenswrapper[4956]: I0314 09:37:13.145240 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.145276 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.147286 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b758b526-d58f-4990-b10a-4f0672f5e63f-logs\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.158654 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.161116 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.163166 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.169026 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.169506 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztbsz\" (UniqueName: \"kubernetes.io/projected/b758b526-d58f-4990-b10a-4f0672f5e63f-kube-api-access-ztbsz\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.170872 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.241144 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.276743 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.565447 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:37:13 crc kubenswrapper[4956]: W0314 09:37:13.569423 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ae562ee_21b4_4af0_bbf7_3d9df6e6b890.slice/crio-f28c032ba1bf7d5b3e19b5d708fcd4bc59c586ee40f5b4de705bc543d223c9a8 WatchSource:0}: Error finding container f28c032ba1bf7d5b3e19b5d708fcd4bc59c586ee40f5b4de705bc543d223c9a8: Status 404 returned error can't find the container with id f28c032ba1bf7d5b3e19b5d708fcd4bc59c586ee40f5b4de705bc543d223c9a8 Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.747742 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:37:13 crc kubenswrapper[4956]: W0314 09:37:13.751425 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d6c367c_6fde_474a_a698_67aacdc8c650.slice/crio-ae881add9e006a10dfa7db2e20bc9249d902bd560c01de3dbc5e96d7932ddd5d WatchSource:0}: Error finding container ae881add9e006a10dfa7db2e20bc9249d902bd560c01de3dbc5e96d7932ddd5d: Status 404 returned error can't find the container with id ae881add9e006a10dfa7db2e20bc9249d902bd560c01de3dbc5e96d7932ddd5d Mar 14 09:37:13 crc kubenswrapper[4956]: I0314 09:37:13.831153 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:37:14 crc kubenswrapper[4956]: I0314 09:37:14.465452 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"6d6c367c-6fde-474a-a698-67aacdc8c650","Type":"ContainerStarted","Data":"84065e36fa76f0b1585c2c09ad11f36b3d53bfb7d0a6f0856105fd0f8a76ac6e"} Mar 
14 09:37:14 crc kubenswrapper[4956]: I0314 09:37:14.465871 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"6d6c367c-6fde-474a-a698-67aacdc8c650","Type":"ContainerStarted","Data":"ae881add9e006a10dfa7db2e20bc9249d902bd560c01de3dbc5e96d7932ddd5d"} Mar 14 09:37:14 crc kubenswrapper[4956]: I0314 09:37:14.473517 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890","Type":"ContainerStarted","Data":"07b270526837b00a7168e2ebd440ba9db5624dd53d5cfde4467aaadb54dee073"} Mar 14 09:37:14 crc kubenswrapper[4956]: I0314 09:37:14.473557 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890","Type":"ContainerStarted","Data":"f28c032ba1bf7d5b3e19b5d708fcd4bc59c586ee40f5b4de705bc543d223c9a8"} Mar 14 09:37:14 crc kubenswrapper[4956]: I0314 09:37:14.478717 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b758b526-d58f-4990-b10a-4f0672f5e63f","Type":"ContainerStarted","Data":"6a826d3d92b19a06560f7a3a5b16dc9eadbec01c3546249e32d9d82affcf38ef"} Mar 14 09:37:14 crc kubenswrapper[4956]: I0314 09:37:14.478783 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b758b526-d58f-4990-b10a-4f0672f5e63f","Type":"ContainerStarted","Data":"7096b1d326d79cbb3b6f36361dd6ae7e021438d286ff3a91bcb470b7b8784642"} Mar 14 09:37:14 crc kubenswrapper[4956]: I0314 09:37:14.478800 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b758b526-d58f-4990-b10a-4f0672f5e63f","Type":"ContainerStarted","Data":"e62c7fd0bc66ed80a85e4e16cef8573b4a59d5c194bde616c2018eaf794c9a6e"} Mar 14 09:37:14 crc kubenswrapper[4956]: I0314 09:37:14.480341 4956 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:14 crc kubenswrapper[4956]: I0314 09:37:14.504156 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.504130934 podStartE2EDuration="2.504130934s" podCreationTimestamp="2026-03-14 09:37:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:37:14.48275339 +0000 UTC m=+2439.995445668" watchObservedRunningTime="2026-03-14 09:37:14.504130934 +0000 UTC m=+2440.016823202" Mar 14 09:37:14 crc kubenswrapper[4956]: I0314 09:37:14.510327 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.510307331 podStartE2EDuration="2.510307331s" podCreationTimestamp="2026-03-14 09:37:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:37:14.509753267 +0000 UTC m=+2440.022445545" watchObservedRunningTime="2026-03-14 09:37:14.510307331 +0000 UTC m=+2440.022999599" Mar 14 09:37:14 crc kubenswrapper[4956]: I0314 09:37:14.537807 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.5377903 podStartE2EDuration="2.5377903s" podCreationTimestamp="2026-03-14 09:37:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:37:14.535171144 +0000 UTC m=+2440.047863422" watchObservedRunningTime="2026-03-14 09:37:14.5377903 +0000 UTC m=+2440.050482568" Mar 14 09:37:16 crc kubenswrapper[4956]: I0314 09:37:16.493416 4956 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 09:37:16 
crc kubenswrapper[4956]: I0314 09:37:16.920252 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:18 crc kubenswrapper[4956]: I0314 09:37:18.241981 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:18 crc kubenswrapper[4956]: I0314 09:37:18.278455 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:20 crc kubenswrapper[4956]: I0314 09:37:20.210131 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:37:20 crc kubenswrapper[4956]: E0314 09:37:20.212220 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:37:23 crc kubenswrapper[4956]: I0314 09:37:23.013604 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:23 crc kubenswrapper[4956]: I0314 09:37:23.040518 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:23 crc kubenswrapper[4956]: I0314 09:37:23.242394 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:23 crc kubenswrapper[4956]: I0314 09:37:23.273553 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:23 
crc kubenswrapper[4956]: I0314 09:37:23.277583 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:23 crc kubenswrapper[4956]: I0314 09:37:23.292654 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:23 crc kubenswrapper[4956]: I0314 09:37:23.556444 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:23 crc kubenswrapper[4956]: I0314 09:37:23.565569 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:23 crc kubenswrapper[4956]: I0314 09:37:23.580327 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:23 crc kubenswrapper[4956]: I0314 09:37:23.580639 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:25 crc kubenswrapper[4956]: I0314 09:37:25.614274 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:37:25 crc kubenswrapper[4956]: I0314 09:37:25.615897 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerName="ceilometer-central-agent" containerID="cri-o://af836f6aeaddec36f487bc6f4e4ebfd78ba616c5b8512fa29140c4f0643265d0" gracePeriod=30 Mar 14 09:37:25 crc kubenswrapper[4956]: I0314 09:37:25.616250 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerName="sg-core" containerID="cri-o://f234e308a7aed0b1ddb81aab4ed35334273293dfc16fe37bdf64d19fcadbd349" gracePeriod=30 
Mar 14 09:37:25 crc kubenswrapper[4956]: I0314 09:37:25.616276 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerName="proxy-httpd" containerID="cri-o://f580164ad60f3de6bf2743a9d64d8bfc01df0c0efb82726d89811de48f87ce90" gracePeriod=30 Mar 14 09:37:25 crc kubenswrapper[4956]: I0314 09:37:25.616300 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerName="ceilometer-notification-agent" containerID="cri-o://4b83a6b77a8294344c487c091428982b0a2d4add0df4fefce98d52390e3c0f36" gracePeriod=30 Mar 14 09:37:25 crc kubenswrapper[4956]: I0314 09:37:25.632334 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.188:3000/\": EOF" Mar 14 09:37:26 crc kubenswrapper[4956]: I0314 09:37:26.581884 4956 generic.go:334] "Generic (PLEG): container finished" podID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerID="f580164ad60f3de6bf2743a9d64d8bfc01df0c0efb82726d89811de48f87ce90" exitCode=0 Mar 14 09:37:26 crc kubenswrapper[4956]: I0314 09:37:26.581915 4956 generic.go:334] "Generic (PLEG): container finished" podID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerID="f234e308a7aed0b1ddb81aab4ed35334273293dfc16fe37bdf64d19fcadbd349" exitCode=2 Mar 14 09:37:26 crc kubenswrapper[4956]: I0314 09:37:26.581923 4956 generic.go:334] "Generic (PLEG): container finished" podID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerID="af836f6aeaddec36f487bc6f4e4ebfd78ba616c5b8512fa29140c4f0643265d0" exitCode=0 Mar 14 09:37:26 crc kubenswrapper[4956]: I0314 09:37:26.581939 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"f6925581-38d4-4d3e-b270-2924e26ec93f","Type":"ContainerDied","Data":"f580164ad60f3de6bf2743a9d64d8bfc01df0c0efb82726d89811de48f87ce90"} Mar 14 09:37:26 crc kubenswrapper[4956]: I0314 09:37:26.581962 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f6925581-38d4-4d3e-b270-2924e26ec93f","Type":"ContainerDied","Data":"f234e308a7aed0b1ddb81aab4ed35334273293dfc16fe37bdf64d19fcadbd349"} Mar 14 09:37:26 crc kubenswrapper[4956]: I0314 09:37:26.581971 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f6925581-38d4-4d3e-b270-2924e26ec93f","Type":"ContainerDied","Data":"af836f6aeaddec36f487bc6f4e4ebfd78ba616c5b8512fa29140c4f0643265d0"} Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.511940 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.598605 4956 generic.go:334] "Generic (PLEG): container finished" podID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerID="4b83a6b77a8294344c487c091428982b0a2d4add0df4fefce98d52390e3c0f36" exitCode=0 Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.598666 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f6925581-38d4-4d3e-b270-2924e26ec93f","Type":"ContainerDied","Data":"4b83a6b77a8294344c487c091428982b0a2d4add0df4fefce98d52390e3c0f36"} Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.598686 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.598712 4956 scope.go:117] "RemoveContainer" containerID="f580164ad60f3de6bf2743a9d64d8bfc01df0c0efb82726d89811de48f87ce90" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.598699 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f6925581-38d4-4d3e-b270-2924e26ec93f","Type":"ContainerDied","Data":"a99ada92801949f65a4b7b3a4694c5d975e2a9cf24348332944da323a87992c0"} Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.613731 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-sg-core-conf-yaml\") pod \"f6925581-38d4-4d3e-b270-2924e26ec93f\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.613777 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6925581-38d4-4d3e-b270-2924e26ec93f-run-httpd\") pod \"f6925581-38d4-4d3e-b270-2924e26ec93f\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.613811 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6925581-38d4-4d3e-b270-2924e26ec93f-log-httpd\") pod \"f6925581-38d4-4d3e-b270-2924e26ec93f\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.613838 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdkwc\" (UniqueName: \"kubernetes.io/projected/f6925581-38d4-4d3e-b270-2924e26ec93f-kube-api-access-xdkwc\") pod \"f6925581-38d4-4d3e-b270-2924e26ec93f\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " Mar 14 09:37:28 crc 
kubenswrapper[4956]: I0314 09:37:28.614173 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6925581-38d4-4d3e-b270-2924e26ec93f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f6925581-38d4-4d3e-b270-2924e26ec93f" (UID: "f6925581-38d4-4d3e-b270-2924e26ec93f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.614287 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6925581-38d4-4d3e-b270-2924e26ec93f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f6925581-38d4-4d3e-b270-2924e26ec93f" (UID: "f6925581-38d4-4d3e-b270-2924e26ec93f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.614499 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-ceilometer-tls-certs\") pod \"f6925581-38d4-4d3e-b270-2924e26ec93f\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.614546 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-scripts\") pod \"f6925581-38d4-4d3e-b270-2924e26ec93f\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.614590 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-config-data\") pod \"f6925581-38d4-4d3e-b270-2924e26ec93f\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.614661 4956 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-combined-ca-bundle\") pod \"f6925581-38d4-4d3e-b270-2924e26ec93f\" (UID: \"f6925581-38d4-4d3e-b270-2924e26ec93f\") " Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.615335 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6925581-38d4-4d3e-b270-2924e26ec93f-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.615358 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6925581-38d4-4d3e-b270-2924e26ec93f-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.617756 4956 scope.go:117] "RemoveContainer" containerID="f234e308a7aed0b1ddb81aab4ed35334273293dfc16fe37bdf64d19fcadbd349" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.619943 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6925581-38d4-4d3e-b270-2924e26ec93f-kube-api-access-xdkwc" (OuterVolumeSpecName: "kube-api-access-xdkwc") pod "f6925581-38d4-4d3e-b270-2924e26ec93f" (UID: "f6925581-38d4-4d3e-b270-2924e26ec93f"). InnerVolumeSpecName "kube-api-access-xdkwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.621771 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-scripts" (OuterVolumeSpecName: "scripts") pod "f6925581-38d4-4d3e-b270-2924e26ec93f" (UID: "f6925581-38d4-4d3e-b270-2924e26ec93f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.638745 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f6925581-38d4-4d3e-b270-2924e26ec93f" (UID: "f6925581-38d4-4d3e-b270-2924e26ec93f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.674642 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f6925581-38d4-4d3e-b270-2924e26ec93f" (UID: "f6925581-38d4-4d3e-b270-2924e26ec93f"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.686105 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6925581-38d4-4d3e-b270-2924e26ec93f" (UID: "f6925581-38d4-4d3e-b270-2924e26ec93f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.705443 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-config-data" (OuterVolumeSpecName: "config-data") pod "f6925581-38d4-4d3e-b270-2924e26ec93f" (UID: "f6925581-38d4-4d3e-b270-2924e26ec93f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.711639 4956 scope.go:117] "RemoveContainer" containerID="4b83a6b77a8294344c487c091428982b0a2d4add0df4fefce98d52390e3c0f36" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.716661 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.716689 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.716699 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdkwc\" (UniqueName: \"kubernetes.io/projected/f6925581-38d4-4d3e-b270-2924e26ec93f-kube-api-access-xdkwc\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.716712 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.716724 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.716736 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6925581-38d4-4d3e-b270-2924e26ec93f-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.731444 4956 scope.go:117] "RemoveContainer" 
containerID="af836f6aeaddec36f487bc6f4e4ebfd78ba616c5b8512fa29140c4f0643265d0" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.752765 4956 scope.go:117] "RemoveContainer" containerID="f580164ad60f3de6bf2743a9d64d8bfc01df0c0efb82726d89811de48f87ce90" Mar 14 09:37:28 crc kubenswrapper[4956]: E0314 09:37:28.753614 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f580164ad60f3de6bf2743a9d64d8bfc01df0c0efb82726d89811de48f87ce90\": container with ID starting with f580164ad60f3de6bf2743a9d64d8bfc01df0c0efb82726d89811de48f87ce90 not found: ID does not exist" containerID="f580164ad60f3de6bf2743a9d64d8bfc01df0c0efb82726d89811de48f87ce90" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.753670 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f580164ad60f3de6bf2743a9d64d8bfc01df0c0efb82726d89811de48f87ce90"} err="failed to get container status \"f580164ad60f3de6bf2743a9d64d8bfc01df0c0efb82726d89811de48f87ce90\": rpc error: code = NotFound desc = could not find container \"f580164ad60f3de6bf2743a9d64d8bfc01df0c0efb82726d89811de48f87ce90\": container with ID starting with f580164ad60f3de6bf2743a9d64d8bfc01df0c0efb82726d89811de48f87ce90 not found: ID does not exist" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.753704 4956 scope.go:117] "RemoveContainer" containerID="f234e308a7aed0b1ddb81aab4ed35334273293dfc16fe37bdf64d19fcadbd349" Mar 14 09:37:28 crc kubenswrapper[4956]: E0314 09:37:28.754073 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f234e308a7aed0b1ddb81aab4ed35334273293dfc16fe37bdf64d19fcadbd349\": container with ID starting with f234e308a7aed0b1ddb81aab4ed35334273293dfc16fe37bdf64d19fcadbd349 not found: ID does not exist" containerID="f234e308a7aed0b1ddb81aab4ed35334273293dfc16fe37bdf64d19fcadbd349" Mar 14 09:37:28 crc 
kubenswrapper[4956]: I0314 09:37:28.754103 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f234e308a7aed0b1ddb81aab4ed35334273293dfc16fe37bdf64d19fcadbd349"} err="failed to get container status \"f234e308a7aed0b1ddb81aab4ed35334273293dfc16fe37bdf64d19fcadbd349\": rpc error: code = NotFound desc = could not find container \"f234e308a7aed0b1ddb81aab4ed35334273293dfc16fe37bdf64d19fcadbd349\": container with ID starting with f234e308a7aed0b1ddb81aab4ed35334273293dfc16fe37bdf64d19fcadbd349 not found: ID does not exist" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.754123 4956 scope.go:117] "RemoveContainer" containerID="4b83a6b77a8294344c487c091428982b0a2d4add0df4fefce98d52390e3c0f36" Mar 14 09:37:28 crc kubenswrapper[4956]: E0314 09:37:28.755028 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b83a6b77a8294344c487c091428982b0a2d4add0df4fefce98d52390e3c0f36\": container with ID starting with 4b83a6b77a8294344c487c091428982b0a2d4add0df4fefce98d52390e3c0f36 not found: ID does not exist" containerID="4b83a6b77a8294344c487c091428982b0a2d4add0df4fefce98d52390e3c0f36" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.755077 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b83a6b77a8294344c487c091428982b0a2d4add0df4fefce98d52390e3c0f36"} err="failed to get container status \"4b83a6b77a8294344c487c091428982b0a2d4add0df4fefce98d52390e3c0f36\": rpc error: code = NotFound desc = could not find container \"4b83a6b77a8294344c487c091428982b0a2d4add0df4fefce98d52390e3c0f36\": container with ID starting with 4b83a6b77a8294344c487c091428982b0a2d4add0df4fefce98d52390e3c0f36 not found: ID does not exist" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.755106 4956 scope.go:117] "RemoveContainer" containerID="af836f6aeaddec36f487bc6f4e4ebfd78ba616c5b8512fa29140c4f0643265d0" Mar 14 
09:37:28 crc kubenswrapper[4956]: E0314 09:37:28.755400 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af836f6aeaddec36f487bc6f4e4ebfd78ba616c5b8512fa29140c4f0643265d0\": container with ID starting with af836f6aeaddec36f487bc6f4e4ebfd78ba616c5b8512fa29140c4f0643265d0 not found: ID does not exist" containerID="af836f6aeaddec36f487bc6f4e4ebfd78ba616c5b8512fa29140c4f0643265d0" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.755426 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af836f6aeaddec36f487bc6f4e4ebfd78ba616c5b8512fa29140c4f0643265d0"} err="failed to get container status \"af836f6aeaddec36f487bc6f4e4ebfd78ba616c5b8512fa29140c4f0643265d0\": rpc error: code = NotFound desc = could not find container \"af836f6aeaddec36f487bc6f4e4ebfd78ba616c5b8512fa29140c4f0643265d0\": container with ID starting with af836f6aeaddec36f487bc6f4e4ebfd78ba616c5b8512fa29140c4f0643265d0 not found: ID does not exist" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.944545 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.956408 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.968475 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:37:28 crc kubenswrapper[4956]: E0314 09:37:28.968871 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerName="proxy-httpd" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.968892 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerName="proxy-httpd" Mar 14 09:37:28 crc kubenswrapper[4956]: E0314 09:37:28.968907 4956 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerName="sg-core" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.968917 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerName="sg-core" Mar 14 09:37:28 crc kubenswrapper[4956]: E0314 09:37:28.968939 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerName="ceilometer-notification-agent" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.968949 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerName="ceilometer-notification-agent" Mar 14 09:37:28 crc kubenswrapper[4956]: E0314 09:37:28.968976 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerName="ceilometer-central-agent" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.968984 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerName="ceilometer-central-agent" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.969166 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerName="ceilometer-notification-agent" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.969187 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerName="sg-core" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.969206 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerName="proxy-httpd" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.969217 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" containerName="ceilometer-central-agent" Mar 14 09:37:28 crc 
kubenswrapper[4956]: I0314 09:37:28.971098 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.974400 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.974840 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.975246 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:37:28 crc kubenswrapper[4956]: I0314 09:37:28.982002 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.025779 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-config-data\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.025840 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6732f3ac-41dd-4304-9058-fd30a7eb3f37-run-httpd\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.026000 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " 
pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.026104 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.026164 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.026197 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-scripts\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.026223 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgmx2\" (UniqueName: \"kubernetes.io/projected/6732f3ac-41dd-4304-9058-fd30a7eb3f37-kube-api-access-dgmx2\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.026334 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6732f3ac-41dd-4304-9058-fd30a7eb3f37-log-httpd\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc 
kubenswrapper[4956]: I0314 09:37:29.128328 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.128387 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.128418 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-scripts\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.128444 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgmx2\" (UniqueName: \"kubernetes.io/projected/6732f3ac-41dd-4304-9058-fd30a7eb3f37-kube-api-access-dgmx2\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.128475 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6732f3ac-41dd-4304-9058-fd30a7eb3f37-log-httpd\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.128557 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-config-data\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.128588 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6732f3ac-41dd-4304-9058-fd30a7eb3f37-run-httpd\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.128635 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.129741 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6732f3ac-41dd-4304-9058-fd30a7eb3f37-log-httpd\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.129901 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6732f3ac-41dd-4304-9058-fd30a7eb3f37-run-httpd\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.132137 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 
09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.132280 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.133291 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-config-data\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.133566 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.135235 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-scripts\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.155163 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgmx2\" (UniqueName: \"kubernetes.io/projected/6732f3ac-41dd-4304-9058-fd30a7eb3f37-kube-api-access-dgmx2\") pod \"ceilometer-0\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.236428 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6925581-38d4-4d3e-b270-2924e26ec93f" 
path="/var/lib/kubelet/pods/f6925581-38d4-4d3e-b270-2924e26ec93f/volumes" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.286091 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:29 crc kubenswrapper[4956]: I0314 09:37:29.741760 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:37:30 crc kubenswrapper[4956]: I0314 09:37:30.617690 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6732f3ac-41dd-4304-9058-fd30a7eb3f37","Type":"ContainerStarted","Data":"8a6c82a88d310589e2c13639d8725404c296e45576e0941be14e1263b40dffcc"} Mar 14 09:37:30 crc kubenswrapper[4956]: I0314 09:37:30.618037 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6732f3ac-41dd-4304-9058-fd30a7eb3f37","Type":"ContainerStarted","Data":"e0c765871f5043d8fad2a0e0eb695f9e36c6937c0b03c8d70c53c22f9bd8a548"} Mar 14 09:37:31 crc kubenswrapper[4956]: I0314 09:37:31.210532 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:37:31 crc kubenswrapper[4956]: E0314 09:37:31.211137 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:37:31 crc kubenswrapper[4956]: I0314 09:37:31.626753 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"6732f3ac-41dd-4304-9058-fd30a7eb3f37","Type":"ContainerStarted","Data":"043fece4492dd96929a18f91a3ed30ef7e9495e0ce541358bed56d7bf4f9ced0"} Mar 14 09:37:32 crc kubenswrapper[4956]: I0314 09:37:32.639704 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6732f3ac-41dd-4304-9058-fd30a7eb3f37","Type":"ContainerStarted","Data":"85c78f73d76851ad60e66f3dd7318eee356254f09a7c201a5201d306d3f4c103"} Mar 14 09:37:34 crc kubenswrapper[4956]: I0314 09:37:34.671825 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6732f3ac-41dd-4304-9058-fd30a7eb3f37","Type":"ContainerStarted","Data":"8526826613a3132376d5ee31843e1927f8a239198f8f22204857ac298b732c9b"} Mar 14 09:37:34 crc kubenswrapper[4956]: I0314 09:37:34.672267 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:34 crc kubenswrapper[4956]: I0314 09:37:34.697706 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.804902221 podStartE2EDuration="6.697680501s" podCreationTimestamp="2026-03-14 09:37:28 +0000 UTC" firstStartedPulling="2026-03-14 09:37:29.745405821 +0000 UTC m=+2455.258098089" lastFinishedPulling="2026-03-14 09:37:33.638184101 +0000 UTC m=+2459.150876369" observedRunningTime="2026-03-14 09:37:34.689885363 +0000 UTC m=+2460.202577651" watchObservedRunningTime="2026-03-14 09:37:34.697680501 +0000 UTC m=+2460.210372769" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.059363 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/memcached-0"] Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.059831 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/memcached-0" podUID="d0b0e4c3-5f9a-4415-9440-f2758780999a" containerName="memcached" 
containerID="cri-o://40c00dcf4fdbf2fbc139da1d3144ae367b399901fa005550f10307fd6f6a5d41" gracePeriod=30 Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.135360 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.135620 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="6d6c367c-6fde-474a-a698-67aacdc8c650" containerName="watcher-applier" containerID="cri-o://84065e36fa76f0b1585c2c09ad11f36b3d53bfb7d0a6f0856105fd0f8a76ac6e" gracePeriod=30 Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.150225 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.150526 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="b758b526-d58f-4990-b10a-4f0672f5e63f" containerName="watcher-kuttl-api-log" containerID="cri-o://7096b1d326d79cbb3b6f36361dd6ae7e021438d286ff3a91bcb470b7b8784642" gracePeriod=30 Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.150589 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="b758b526-d58f-4990-b10a-4f0672f5e63f" containerName="watcher-api" containerID="cri-o://6a826d3d92b19a06560f7a3a5b16dc9eadbec01c3546249e32d9d82affcf38ef" gracePeriod=30 Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.163568 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.163821 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="3ae562ee-21b4-4af0-bbf7-3d9df6e6b890" 
containerName="watcher-decision-engine" containerID="cri-o://07b270526837b00a7168e2ebd440ba9db5624dd53d5cfde4467aaadb54dee073" gracePeriod=30 Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.268178 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-tttxv"] Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.269984 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.276415 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-tttxv"] Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.285413 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-memcached-mtls" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.286363 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"osp-secret" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.376619 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-combined-ca-bundle\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.376709 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-cert-memcached-mtls\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.376741 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-ww2f8\" (UniqueName: \"kubernetes.io/projected/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-kube-api-access-ww2f8\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.376789 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-scripts\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.376859 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-config-data\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.376916 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-credential-keys\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.376943 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-fernet-keys\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.478588 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-config-data\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.478678 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-credential-keys\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.478708 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-fernet-keys\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.478745 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-combined-ca-bundle\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.478795 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-cert-memcached-mtls\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.478819 4956 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-ww2f8\" (UniqueName: \"kubernetes.io/projected/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-kube-api-access-ww2f8\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.478868 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-scripts\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.485974 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-config-data\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.486742 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-combined-ca-bundle\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.487836 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-credential-keys\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.488181 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-scripts\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.488315 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-fernet-keys\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.489681 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-cert-memcached-mtls\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.504033 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww2f8\" (UniqueName: \"kubernetes.io/projected/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-kube-api-access-ww2f8\") pod \"keystone-bootstrap-tttxv\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.620960 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.703859 4956 generic.go:334] "Generic (PLEG): container finished" podID="b758b526-d58f-4990-b10a-4f0672f5e63f" containerID="7096b1d326d79cbb3b6f36361dd6ae7e021438d286ff3a91bcb470b7b8784642" exitCode=143 Mar 14 09:37:37 crc kubenswrapper[4956]: I0314 09:37:37.703930 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b758b526-d58f-4990-b10a-4f0672f5e63f","Type":"ContainerDied","Data":"7096b1d326d79cbb3b6f36361dd6ae7e021438d286ff3a91bcb470b7b8784642"} Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.118959 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-tttxv"] Mar 14 09:37:38 crc kubenswrapper[4956]: W0314 09:37:38.125230 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3721c946_1a7b_4a9d_8fb8_1455cfa063c8.slice/crio-522720b510410a5a58a126c95aa9e9f2291bdd8111750a49cc4ae7eb3e25854b WatchSource:0}: Error finding container 522720b510410a5a58a126c95aa9e9f2291bdd8111750a49cc4ae7eb3e25854b: Status 404 returned error can't find the container with id 522720b510410a5a58a126c95aa9e9f2291bdd8111750a49cc4ae7eb3e25854b Mar 14 09:37:38 crc kubenswrapper[4956]: E0314 09:37:38.244492 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84065e36fa76f0b1585c2c09ad11f36b3d53bfb7d0a6f0856105fd0f8a76ac6e" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 09:37:38 crc kubenswrapper[4956]: E0314 09:37:38.246691 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" containerID="84065e36fa76f0b1585c2c09ad11f36b3d53bfb7d0a6f0856105fd0f8a76ac6e" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 09:37:38 crc kubenswrapper[4956]: E0314 09:37:38.248581 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84065e36fa76f0b1585c2c09ad11f36b3d53bfb7d0a6f0856105fd0f8a76ac6e" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 09:37:38 crc kubenswrapper[4956]: E0314 09:37:38.248641 4956 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="6d6c367c-6fde-474a-a698-67aacdc8c650" containerName="watcher-applier" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.279051 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="b758b526-d58f-4990-b10a-4f0672f5e63f" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.194:9322/\": dial tcp 10.217.0.194:9322: connect: connection refused" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.279528 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="b758b526-d58f-4990-b10a-4f0672f5e63f" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"https://10.217.0.194:9322/\": dial tcp 10.217.0.194:9322: connect: connection refused" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.721643 4956 generic.go:334] "Generic (PLEG): container finished" podID="d0b0e4c3-5f9a-4415-9440-f2758780999a" containerID="40c00dcf4fdbf2fbc139da1d3144ae367b399901fa005550f10307fd6f6a5d41" exitCode=0 Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.721704 
4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"d0b0e4c3-5f9a-4415-9440-f2758780999a","Type":"ContainerDied","Data":"40c00dcf4fdbf2fbc139da1d3144ae367b399901fa005550f10307fd6f6a5d41"} Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.725199 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-tttxv" event={"ID":"3721c946-1a7b-4a9d-8fb8-1455cfa063c8","Type":"ContainerStarted","Data":"9af1cd6069e07e614f66c814eff30be6f4937a9b522bee17ce0535aafb1b020b"} Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.725249 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-tttxv" event={"ID":"3721c946-1a7b-4a9d-8fb8-1455cfa063c8","Type":"ContainerStarted","Data":"522720b510410a5a58a126c95aa9e9f2291bdd8111750a49cc4ae7eb3e25854b"} Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.727826 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.728800 4956 generic.go:334] "Generic (PLEG): container finished" podID="b758b526-d58f-4990-b10a-4f0672f5e63f" containerID="6a826d3d92b19a06560f7a3a5b16dc9eadbec01c3546249e32d9d82affcf38ef" exitCode=0 Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.728849 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b758b526-d58f-4990-b10a-4f0672f5e63f","Type":"ContainerDied","Data":"6a826d3d92b19a06560f7a3a5b16dc9eadbec01c3546249e32d9d82affcf38ef"} Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.728879 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b758b526-d58f-4990-b10a-4f0672f5e63f","Type":"ContainerDied","Data":"e62c7fd0bc66ed80a85e4e16cef8573b4a59d5c194bde616c2018eaf794c9a6e"} Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.728901 4956 scope.go:117] "RemoveContainer" containerID="6a826d3d92b19a06560f7a3a5b16dc9eadbec01c3546249e32d9d82affcf38ef" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.754117 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-bootstrap-tttxv" podStartSLOduration=1.754093971 podStartE2EDuration="1.754093971s" podCreationTimestamp="2026-03-14 09:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:37:38.743138524 +0000 UTC m=+2464.255830792" watchObservedRunningTime="2026-03-14 09:37:38.754093971 +0000 UTC m=+2464.266786239" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.760200 4956 scope.go:117] "RemoveContainer" containerID="7096b1d326d79cbb3b6f36361dd6ae7e021438d286ff3a91bcb470b7b8784642" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.801759 4956 scope.go:117] "RemoveContainer" 
containerID="6a826d3d92b19a06560f7a3a5b16dc9eadbec01c3546249e32d9d82affcf38ef" Mar 14 09:37:38 crc kubenswrapper[4956]: E0314 09:37:38.803303 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a826d3d92b19a06560f7a3a5b16dc9eadbec01c3546249e32d9d82affcf38ef\": container with ID starting with 6a826d3d92b19a06560f7a3a5b16dc9eadbec01c3546249e32d9d82affcf38ef not found: ID does not exist" containerID="6a826d3d92b19a06560f7a3a5b16dc9eadbec01c3546249e32d9d82affcf38ef" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.803351 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a826d3d92b19a06560f7a3a5b16dc9eadbec01c3546249e32d9d82affcf38ef"} err="failed to get container status \"6a826d3d92b19a06560f7a3a5b16dc9eadbec01c3546249e32d9d82affcf38ef\": rpc error: code = NotFound desc = could not find container \"6a826d3d92b19a06560f7a3a5b16dc9eadbec01c3546249e32d9d82affcf38ef\": container with ID starting with 6a826d3d92b19a06560f7a3a5b16dc9eadbec01c3546249e32d9d82affcf38ef not found: ID does not exist" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.803379 4956 scope.go:117] "RemoveContainer" containerID="7096b1d326d79cbb3b6f36361dd6ae7e021438d286ff3a91bcb470b7b8784642" Mar 14 09:37:38 crc kubenswrapper[4956]: E0314 09:37:38.804282 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7096b1d326d79cbb3b6f36361dd6ae7e021438d286ff3a91bcb470b7b8784642\": container with ID starting with 7096b1d326d79cbb3b6f36361dd6ae7e021438d286ff3a91bcb470b7b8784642 not found: ID does not exist" containerID="7096b1d326d79cbb3b6f36361dd6ae7e021438d286ff3a91bcb470b7b8784642" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.804308 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7096b1d326d79cbb3b6f36361dd6ae7e021438d286ff3a91bcb470b7b8784642"} err="failed to get container status \"7096b1d326d79cbb3b6f36361dd6ae7e021438d286ff3a91bcb470b7b8784642\": rpc error: code = NotFound desc = could not find container \"7096b1d326d79cbb3b6f36361dd6ae7e021438d286ff3a91bcb470b7b8784642\": container with ID starting with 7096b1d326d79cbb3b6f36361dd6ae7e021438d286ff3a91bcb470b7b8784642 not found: ID does not exist" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.811603 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b758b526-d58f-4990-b10a-4f0672f5e63f-logs\") pod \"b758b526-d58f-4990-b10a-4f0672f5e63f\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.811713 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-combined-ca-bundle\") pod \"b758b526-d58f-4990-b10a-4f0672f5e63f\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.811747 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztbsz\" (UniqueName: \"kubernetes.io/projected/b758b526-d58f-4990-b10a-4f0672f5e63f-kube-api-access-ztbsz\") pod \"b758b526-d58f-4990-b10a-4f0672f5e63f\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.811820 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-config-data\") pod \"b758b526-d58f-4990-b10a-4f0672f5e63f\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.811844 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-custom-prometheus-ca\") pod \"b758b526-d58f-4990-b10a-4f0672f5e63f\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.811887 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-internal-tls-certs\") pod \"b758b526-d58f-4990-b10a-4f0672f5e63f\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.812043 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-public-tls-certs\") pod \"b758b526-d58f-4990-b10a-4f0672f5e63f\" (UID: \"b758b526-d58f-4990-b10a-4f0672f5e63f\") " Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.812579 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b758b526-d58f-4990-b10a-4f0672f5e63f-logs" (OuterVolumeSpecName: "logs") pod "b758b526-d58f-4990-b10a-4f0672f5e63f" (UID: "b758b526-d58f-4990-b10a-4f0672f5e63f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.827711 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b758b526-d58f-4990-b10a-4f0672f5e63f-kube-api-access-ztbsz" (OuterVolumeSpecName: "kube-api-access-ztbsz") pod "b758b526-d58f-4990-b10a-4f0672f5e63f" (UID: "b758b526-d58f-4990-b10a-4f0672f5e63f"). InnerVolumeSpecName "kube-api-access-ztbsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.855405 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b758b526-d58f-4990-b10a-4f0672f5e63f" (UID: "b758b526-d58f-4990-b10a-4f0672f5e63f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.864014 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b758b526-d58f-4990-b10a-4f0672f5e63f" (UID: "b758b526-d58f-4990-b10a-4f0672f5e63f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.867847 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "b758b526-d58f-4990-b10a-4f0672f5e63f" (UID: "b758b526-d58f-4990-b10a-4f0672f5e63f"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.886938 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-config-data" (OuterVolumeSpecName: "config-data") pod "b758b526-d58f-4990-b10a-4f0672f5e63f" (UID: "b758b526-d58f-4990-b10a-4f0672f5e63f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.889269 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b758b526-d58f-4990-b10a-4f0672f5e63f" (UID: "b758b526-d58f-4990-b10a-4f0672f5e63f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.913567 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.913596 4956 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.913605 4956 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.913613 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b758b526-d58f-4990-b10a-4f0672f5e63f-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.913622 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.913630 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztbsz\" 
(UniqueName: \"kubernetes.io/projected/b758b526-d58f-4990-b10a-4f0672f5e63f-kube-api-access-ztbsz\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.913639 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b758b526-d58f-4990-b10a-4f0672f5e63f-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:38 crc kubenswrapper[4956]: I0314 09:37:38.915427 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.434196 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b0e4c3-5f9a-4415-9440-f2758780999a-memcached-tls-certs\") pod \"d0b0e4c3-5f9a-4415-9440-f2758780999a\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.434271 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b0e4c3-5f9a-4415-9440-f2758780999a-combined-ca-bundle\") pod \"d0b0e4c3-5f9a-4415-9440-f2758780999a\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.434375 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gp4h\" (UniqueName: \"kubernetes.io/projected/d0b0e4c3-5f9a-4415-9440-f2758780999a-kube-api-access-2gp4h\") pod \"d0b0e4c3-5f9a-4415-9440-f2758780999a\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.434496 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0b0e4c3-5f9a-4415-9440-f2758780999a-kolla-config\") pod \"d0b0e4c3-5f9a-4415-9440-f2758780999a\" (UID: 
\"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.434763 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0b0e4c3-5f9a-4415-9440-f2758780999a-config-data\") pod \"d0b0e4c3-5f9a-4415-9440-f2758780999a\" (UID: \"d0b0e4c3-5f9a-4415-9440-f2758780999a\") " Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.435682 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0b0e4c3-5f9a-4415-9440-f2758780999a-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "d0b0e4c3-5f9a-4415-9440-f2758780999a" (UID: "d0b0e4c3-5f9a-4415-9440-f2758780999a"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.435776 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0b0e4c3-5f9a-4415-9440-f2758780999a-config-data" (OuterVolumeSpecName: "config-data") pod "d0b0e4c3-5f9a-4415-9440-f2758780999a" (UID: "d0b0e4c3-5f9a-4415-9440-f2758780999a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.443211 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0b0e4c3-5f9a-4415-9440-f2758780999a-kube-api-access-2gp4h" (OuterVolumeSpecName: "kube-api-access-2gp4h") pod "d0b0e4c3-5f9a-4415-9440-f2758780999a" (UID: "d0b0e4c3-5f9a-4415-9440-f2758780999a"). InnerVolumeSpecName "kube-api-access-2gp4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.462726 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b0e4c3-5f9a-4415-9440-f2758780999a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0b0e4c3-5f9a-4415-9440-f2758780999a" (UID: "d0b0e4c3-5f9a-4415-9440-f2758780999a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.489598 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b0e4c3-5f9a-4415-9440-f2758780999a-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "d0b0e4c3-5f9a-4415-9440-f2758780999a" (UID: "d0b0e4c3-5f9a-4415-9440-f2758780999a"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.536745 4956 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0b0e4c3-5f9a-4415-9440-f2758780999a-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.536778 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0b0e4c3-5f9a-4415-9440-f2758780999a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.536788 4956 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b0e4c3-5f9a-4415-9440-f2758780999a-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.536798 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b0e4c3-5f9a-4415-9440-f2758780999a-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.536808 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gp4h\" (UniqueName: \"kubernetes.io/projected/d0b0e4c3-5f9a-4415-9440-f2758780999a-kube-api-access-2gp4h\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.739690 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.739914 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"d0b0e4c3-5f9a-4415-9440-f2758780999a","Type":"ContainerDied","Data":"3f321230ac8973d08651ef5ec007be3e363fb1f9240503348796de056e19a768"} Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.740112 4956 scope.go:117] "RemoveContainer" containerID="40c00dcf4fdbf2fbc139da1d3144ae367b399901fa005550f10307fd6f6a5d41" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.741213 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.776051 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.793634 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.812244 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/memcached-0"] Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.822564 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/memcached-0"] Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.833633 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:37:39 crc kubenswrapper[4956]: E0314 09:37:39.834055 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b758b526-d58f-4990-b10a-4f0672f5e63f" containerName="watcher-kuttl-api-log" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.834070 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b758b526-d58f-4990-b10a-4f0672f5e63f" containerName="watcher-kuttl-api-log" Mar 14 09:37:39 crc kubenswrapper[4956]: E0314 09:37:39.834099 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b0e4c3-5f9a-4415-9440-f2758780999a" containerName="memcached" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.834105 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b0e4c3-5f9a-4415-9440-f2758780999a" containerName="memcached" Mar 14 09:37:39 crc kubenswrapper[4956]: E0314 09:37:39.834114 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b758b526-d58f-4990-b10a-4f0672f5e63f" containerName="watcher-api" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.834120 4956 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b758b526-d58f-4990-b10a-4f0672f5e63f" containerName="watcher-api" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.834251 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b758b526-d58f-4990-b10a-4f0672f5e63f" containerName="watcher-api" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.834263 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b0e4c3-5f9a-4415-9440-f2758780999a" containerName="memcached" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.834277 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b758b526-d58f-4990-b10a-4f0672f5e63f" containerName="watcher-kuttl-api-log" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.835145 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.837289 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.837289 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.842263 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.843468 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/memcached-0"] Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.844435 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.850470 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.851535 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"memcached-memcached-dockercfg-92tvb" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.851986 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-memcached-svc" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.852120 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"memcached-config-data" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.858941 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.942023 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b78ced78-9d78-4bed-ab13-0f14be6bcb47-logs\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.942105 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpzvx\" (UniqueName: \"kubernetes.io/projected/b78ced78-9d78-4bed-ab13-0f14be6bcb47-kube-api-access-gpzvx\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.942238 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.942337 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.942428 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.942464 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.942525 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:39 crc kubenswrapper[4956]: I0314 09:37:39.942556 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.048679 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2973e91c-c0d8-4a9c-871e-0147ecc86297-config-data\") pod \"memcached-0\" (UID: \"2973e91c-c0d8-4a9c-871e-0147ecc86297\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.048739 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.048766 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2973e91c-c0d8-4a9c-871e-0147ecc86297-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2973e91c-c0d8-4a9c-871e-0147ecc86297\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.048801 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b78ced78-9d78-4bed-ab13-0f14be6bcb47-logs\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.048837 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpzvx\" (UniqueName: 
\"kubernetes.io/projected/b78ced78-9d78-4bed-ab13-0f14be6bcb47-kube-api-access-gpzvx\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.048878 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.048933 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.048967 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2973e91c-c0d8-4a9c-871e-0147ecc86297-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2973e91c-c0d8-4a9c-871e-0147ecc86297\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.048987 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpzbt\" (UniqueName: \"kubernetes.io/projected/2973e91c-c0d8-4a9c-871e-0147ecc86297-kube-api-access-kpzbt\") pod \"memcached-0\" (UID: \"2973e91c-c0d8-4a9c-871e-0147ecc86297\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.049012 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/2973e91c-c0d8-4a9c-871e-0147ecc86297-kolla-config\") pod \"memcached-0\" (UID: \"2973e91c-c0d8-4a9c-871e-0147ecc86297\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.049036 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.049060 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.049076 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.051680 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b78ced78-9d78-4bed-ab13-0f14be6bcb47-logs\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.054627 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: 
\"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.054726 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.056294 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.056724 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.071563 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.071757 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc 
kubenswrapper[4956]: I0314 09:37:40.076361 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpzvx\" (UniqueName: \"kubernetes.io/projected/b78ced78-9d78-4bed-ab13-0f14be6bcb47-kube-api-access-gpzvx\") pod \"watcher-kuttl-api-0\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.151181 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2973e91c-c0d8-4a9c-871e-0147ecc86297-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2973e91c-c0d8-4a9c-871e-0147ecc86297\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.151634 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpzbt\" (UniqueName: \"kubernetes.io/projected/2973e91c-c0d8-4a9c-871e-0147ecc86297-kube-api-access-kpzbt\") pod \"memcached-0\" (UID: \"2973e91c-c0d8-4a9c-871e-0147ecc86297\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.151661 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2973e91c-c0d8-4a9c-871e-0147ecc86297-kolla-config\") pod \"memcached-0\" (UID: \"2973e91c-c0d8-4a9c-871e-0147ecc86297\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.151693 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2973e91c-c0d8-4a9c-871e-0147ecc86297-config-data\") pod \"memcached-0\" (UID: \"2973e91c-c0d8-4a9c-871e-0147ecc86297\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.151713 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2973e91c-c0d8-4a9c-871e-0147ecc86297-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2973e91c-c0d8-4a9c-871e-0147ecc86297\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.152574 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2973e91c-c0d8-4a9c-871e-0147ecc86297-kolla-config\") pod \"memcached-0\" (UID: \"2973e91c-c0d8-4a9c-871e-0147ecc86297\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.152597 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2973e91c-c0d8-4a9c-871e-0147ecc86297-config-data\") pod \"memcached-0\" (UID: \"2973e91c-c0d8-4a9c-871e-0147ecc86297\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.154962 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2973e91c-c0d8-4a9c-871e-0147ecc86297-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2973e91c-c0d8-4a9c-871e-0147ecc86297\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.155385 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.164619 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2973e91c-c0d8-4a9c-871e-0147ecc86297-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2973e91c-c0d8-4a9c-871e-0147ecc86297\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.166811 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpzbt\" (UniqueName: \"kubernetes.io/projected/2973e91c-c0d8-4a9c-871e-0147ecc86297-kube-api-access-kpzbt\") pod \"memcached-0\" (UID: \"2973e91c-c0d8-4a9c-871e-0147ecc86297\") " pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.467909 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.747081 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:37:40 crc kubenswrapper[4956]: W0314 09:37:40.751597 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb78ced78_9d78_4bed_ab13_0f14be6bcb47.slice/crio-886639d4f8fe5e0998dc9ecafbffb534acca3866e3a1b43c8c2e5b11cd8bf810 WatchSource:0}: Error finding container 886639d4f8fe5e0998dc9ecafbffb534acca3866e3a1b43c8c2e5b11cd8bf810: Status 404 returned error can't find the container with id 886639d4f8fe5e0998dc9ecafbffb534acca3866e3a1b43c8c2e5b11cd8bf810 Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.760232 4956 generic.go:334] "Generic (PLEG): container finished" podID="6d6c367c-6fde-474a-a698-67aacdc8c650" containerID="84065e36fa76f0b1585c2c09ad11f36b3d53bfb7d0a6f0856105fd0f8a76ac6e" exitCode=0 Mar 14 09:37:40 crc kubenswrapper[4956]: 
I0314 09:37:40.760265 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"6d6c367c-6fde-474a-a698-67aacdc8c650","Type":"ContainerDied","Data":"84065e36fa76f0b1585c2c09ad11f36b3d53bfb7d0a6f0856105fd0f8a76ac6e"} Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.859791 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.873238 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d6c367c-6fde-474a-a698-67aacdc8c650-config-data\") pod \"6d6c367c-6fde-474a-a698-67aacdc8c650\" (UID: \"6d6c367c-6fde-474a-a698-67aacdc8c650\") " Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.876136 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d6c367c-6fde-474a-a698-67aacdc8c650-logs\") pod \"6d6c367c-6fde-474a-a698-67aacdc8c650\" (UID: \"6d6c367c-6fde-474a-a698-67aacdc8c650\") " Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.876223 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6c367c-6fde-474a-a698-67aacdc8c650-combined-ca-bundle\") pod \"6d6c367c-6fde-474a-a698-67aacdc8c650\" (UID: \"6d6c367c-6fde-474a-a698-67aacdc8c650\") " Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.876283 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjlx5\" (UniqueName: \"kubernetes.io/projected/6d6c367c-6fde-474a-a698-67aacdc8c650-kube-api-access-fjlx5\") pod \"6d6c367c-6fde-474a-a698-67aacdc8c650\" (UID: \"6d6c367c-6fde-474a-a698-67aacdc8c650\") " Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.878252 4956 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/6d6c367c-6fde-474a-a698-67aacdc8c650-logs" (OuterVolumeSpecName: "logs") pod "6d6c367c-6fde-474a-a698-67aacdc8c650" (UID: "6d6c367c-6fde-474a-a698-67aacdc8c650"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.883496 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d6c367c-6fde-474a-a698-67aacdc8c650-kube-api-access-fjlx5" (OuterVolumeSpecName: "kube-api-access-fjlx5") pod "6d6c367c-6fde-474a-a698-67aacdc8c650" (UID: "6d6c367c-6fde-474a-a698-67aacdc8c650"). InnerVolumeSpecName "kube-api-access-fjlx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.908825 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d6c367c-6fde-474a-a698-67aacdc8c650-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d6c367c-6fde-474a-a698-67aacdc8c650" (UID: "6d6c367c-6fde-474a-a698-67aacdc8c650"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.930323 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d6c367c-6fde-474a-a698-67aacdc8c650-config-data" (OuterVolumeSpecName: "config-data") pod "6d6c367c-6fde-474a-a698-67aacdc8c650" (UID: "6d6c367c-6fde-474a-a698-67aacdc8c650"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.963300 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Mar 14 09:37:40 crc kubenswrapper[4956]: W0314 09:37:40.966775 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2973e91c_c0d8_4a9c_871e_0147ecc86297.slice/crio-4d5a77235c4b2607fe9c649f38653e154639248d2443b5181b11c6d656a321c7 WatchSource:0}: Error finding container 4d5a77235c4b2607fe9c649f38653e154639248d2443b5181b11c6d656a321c7: Status 404 returned error can't find the container with id 4d5a77235c4b2607fe9c649f38653e154639248d2443b5181b11c6d656a321c7 Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.977543 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d6c367c-6fde-474a-a698-67aacdc8c650-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.977575 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6c367c-6fde-474a-a698-67aacdc8c650-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.977587 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjlx5\" (UniqueName: \"kubernetes.io/projected/6d6c367c-6fde-474a-a698-67aacdc8c650-kube-api-access-fjlx5\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:40 crc kubenswrapper[4956]: I0314 09:37:40.977596 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d6c367c-6fde-474a-a698-67aacdc8c650-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.221365 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b758b526-d58f-4990-b10a-4f0672f5e63f" 
path="/var/lib/kubelet/pods/b758b526-d58f-4990-b10a-4f0672f5e63f/volumes" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.222223 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0b0e4c3-5f9a-4415-9440-f2758780999a" path="/var/lib/kubelet/pods/d0b0e4c3-5f9a-4415-9440-f2758780999a/volumes" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.771936 4956 generic.go:334] "Generic (PLEG): container finished" podID="3721c946-1a7b-4a9d-8fb8-1455cfa063c8" containerID="9af1cd6069e07e614f66c814eff30be6f4937a9b522bee17ce0535aafb1b020b" exitCode=0 Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.772013 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-tttxv" event={"ID":"3721c946-1a7b-4a9d-8fb8-1455cfa063c8","Type":"ContainerDied","Data":"9af1cd6069e07e614f66c814eff30be6f4937a9b522bee17ce0535aafb1b020b"} Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.774442 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"2973e91c-c0d8-4a9c-871e-0147ecc86297","Type":"ContainerStarted","Data":"50daf5198998bc83d2aeed93dd36b7013c3a5a7d6e6d84a0df506e26a4598a1a"} Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.774508 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"2973e91c-c0d8-4a9c-871e-0147ecc86297","Type":"ContainerStarted","Data":"4d5a77235c4b2607fe9c649f38653e154639248d2443b5181b11c6d656a321c7"} Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.774608 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.776113 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" 
event={"ID":"6d6c367c-6fde-474a-a698-67aacdc8c650","Type":"ContainerDied","Data":"ae881add9e006a10dfa7db2e20bc9249d902bd560c01de3dbc5e96d7932ddd5d"} Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.776164 4956 scope.go:117] "RemoveContainer" containerID="84065e36fa76f0b1585c2c09ad11f36b3d53bfb7d0a6f0856105fd0f8a76ac6e" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.776299 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.778836 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b78ced78-9d78-4bed-ab13-0f14be6bcb47","Type":"ContainerStarted","Data":"adead90c8f7ee235deb6101678c78fecf006b3bea9c5a1f20b661356371de4f1"} Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.778881 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b78ced78-9d78-4bed-ab13-0f14be6bcb47","Type":"ContainerStarted","Data":"73884d6b4bb61fd74a2c93360dae3a9c83e30cd6d8b7095899833a71ae09c38d"} Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.778899 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b78ced78-9d78-4bed-ab13-0f14be6bcb47","Type":"ContainerStarted","Data":"886639d4f8fe5e0998dc9ecafbffb534acca3866e3a1b43c8c2e5b11cd8bf810"} Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.779924 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.831436 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.8314126809999998 podStartE2EDuration="2.831412681s" podCreationTimestamp="2026-03-14 09:37:39 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:37:41.822892075 +0000 UTC m=+2467.335584353" watchObservedRunningTime="2026-03-14 09:37:41.831412681 +0000 UTC m=+2467.344104949" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.849165 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.855849 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.872508 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/memcached-0" podStartSLOduration=2.87246082 podStartE2EDuration="2.87246082s" podCreationTimestamp="2026-03-14 09:37:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:37:41.859848461 +0000 UTC m=+2467.372540719" watchObservedRunningTime="2026-03-14 09:37:41.87246082 +0000 UTC m=+2467.385153098" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.884096 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:37:41 crc kubenswrapper[4956]: E0314 09:37:41.884521 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d6c367c-6fde-474a-a698-67aacdc8c650" containerName="watcher-applier" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.884540 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d6c367c-6fde-474a-a698-67aacdc8c650" containerName="watcher-applier" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.884735 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d6c367c-6fde-474a-a698-67aacdc8c650" containerName="watcher-applier" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 
09:37:41.885365 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.888308 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.891580 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ng94\" (UniqueName: \"kubernetes.io/projected/48177e62-e16a-4975-925f-a3471fb5580b-kube-api-access-2ng94\") pod \"watcher-kuttl-applier-0\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.891665 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.891706 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.891829 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 
09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.891881 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48177e62-e16a-4975-925f-a3471fb5580b-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.893652 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.993574 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ng94\" (UniqueName: \"kubernetes.io/projected/48177e62-e16a-4975-925f-a3471fb5580b-kube-api-access-2ng94\") pod \"watcher-kuttl-applier-0\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.993640 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.993671 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.993762 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.993798 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48177e62-e16a-4975-925f-a3471fb5580b-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:41 crc kubenswrapper[4956]: I0314 09:37:41.994393 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48177e62-e16a-4975-925f-a3471fb5580b-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:42 crc kubenswrapper[4956]: I0314 09:37:42.001092 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:42 crc kubenswrapper[4956]: I0314 09:37:42.005995 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:42 crc kubenswrapper[4956]: I0314 09:37:42.012157 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ng94\" (UniqueName: \"kubernetes.io/projected/48177e62-e16a-4975-925f-a3471fb5580b-kube-api-access-2ng94\") pod 
\"watcher-kuttl-applier-0\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:42 crc kubenswrapper[4956]: I0314 09:37:42.020140 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:42 crc kubenswrapper[4956]: I0314 09:37:42.208338 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:42 crc kubenswrapper[4956]: E0314 09:37:42.560534 4956 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.32:60294->38.102.83.32:40407: write tcp 38.102.83.32:60294->38.102.83.32:40407: write: broken pipe Mar 14 09:37:42 crc kubenswrapper[4956]: I0314 09:37:42.654013 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:37:42 crc kubenswrapper[4956]: W0314 09:37:42.664439 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48177e62_e16a_4975_925f_a3471fb5580b.slice/crio-bdc5fb76834f7f31a00a1c40c4c18e33db1f99081290ca3d316c1c068c5957da WatchSource:0}: Error finding container bdc5fb76834f7f31a00a1c40c4c18e33db1f99081290ca3d316c1c068c5957da: Status 404 returned error can't find the container with id bdc5fb76834f7f31a00a1c40c4c18e33db1f99081290ca3d316c1c068c5957da Mar 14 09:37:42 crc kubenswrapper[4956]: I0314 09:37:42.796045 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"48177e62-e16a-4975-925f-a3471fb5580b","Type":"ContainerStarted","Data":"bdc5fb76834f7f31a00a1c40c4c18e33db1f99081290ca3d316c1c068c5957da"} 
Mar 14 09:37:43 crc kubenswrapper[4956]: E0314 09:37:43.015979 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07b270526837b00a7168e2ebd440ba9db5624dd53d5cfde4467aaadb54dee073" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Mar 14 09:37:43 crc kubenswrapper[4956]: E0314 09:37:43.017508 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07b270526837b00a7168e2ebd440ba9db5624dd53d5cfde4467aaadb54dee073" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Mar 14 09:37:43 crc kubenswrapper[4956]: E0314 09:37:43.020043 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07b270526837b00a7168e2ebd440ba9db5624dd53d5cfde4467aaadb54dee073" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Mar 14 09:37:43 crc kubenswrapper[4956]: E0314 09:37:43.020125 4956 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="3ae562ee-21b4-4af0-bbf7-3d9df6e6b890" containerName="watcher-decision-engine" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.114630 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.136597 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-scripts\") pod \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.136638 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww2f8\" (UniqueName: \"kubernetes.io/projected/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-kube-api-access-ww2f8\") pod \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.136715 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-fernet-keys\") pod \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.136763 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-credential-keys\") pod \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.136821 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-combined-ca-bundle\") pod \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.136849 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-config-data\") pod \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.136909 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-cert-memcached-mtls\") pod \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\" (UID: \"3721c946-1a7b-4a9d-8fb8-1455cfa063c8\") " Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.147665 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-kube-api-access-ww2f8" (OuterVolumeSpecName: "kube-api-access-ww2f8") pod "3721c946-1a7b-4a9d-8fb8-1455cfa063c8" (UID: "3721c946-1a7b-4a9d-8fb8-1455cfa063c8"). InnerVolumeSpecName "kube-api-access-ww2f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.151268 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-scripts" (OuterVolumeSpecName: "scripts") pod "3721c946-1a7b-4a9d-8fb8-1455cfa063c8" (UID: "3721c946-1a7b-4a9d-8fb8-1455cfa063c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.157983 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3721c946-1a7b-4a9d-8fb8-1455cfa063c8" (UID: "3721c946-1a7b-4a9d-8fb8-1455cfa063c8"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.158124 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3721c946-1a7b-4a9d-8fb8-1455cfa063c8" (UID: "3721c946-1a7b-4a9d-8fb8-1455cfa063c8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.176476 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3721c946-1a7b-4a9d-8fb8-1455cfa063c8" (UID: "3721c946-1a7b-4a9d-8fb8-1455cfa063c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.177158 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-config-data" (OuterVolumeSpecName: "config-data") pod "3721c946-1a7b-4a9d-8fb8-1455cfa063c8" (UID: "3721c946-1a7b-4a9d-8fb8-1455cfa063c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.206631 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "3721c946-1a7b-4a9d-8fb8-1455cfa063c8" (UID: "3721c946-1a7b-4a9d-8fb8-1455cfa063c8"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.214218 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:37:43 crc kubenswrapper[4956]: E0314 09:37:43.214607 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.220996 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d6c367c-6fde-474a-a698-67aacdc8c650" path="/var/lib/kubelet/pods/6d6c367c-6fde-474a-a698-67aacdc8c650/volumes" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.238969 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.238994 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.239003 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.239011 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-scripts\") 
on node \"crc\" DevicePath \"\"" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.239039 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww2f8\" (UniqueName: \"kubernetes.io/projected/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-kube-api-access-ww2f8\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.239049 4956 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.239056 4956 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3721c946-1a7b-4a9d-8fb8-1455cfa063c8-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.808846 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"48177e62-e16a-4975-925f-a3471fb5580b","Type":"ContainerStarted","Data":"a9d493c45276a19adb91d13fdc1cda447b7b7a3d3d9193682c2670d6776fde68"} Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.810340 4956 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.810372 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-tttxv" event={"ID":"3721c946-1a7b-4a9d-8fb8-1455cfa063c8","Type":"ContainerDied","Data":"522720b510410a5a58a126c95aa9e9f2291bdd8111750a49cc4ae7eb3e25854b"} Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.810417 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="522720b510410a5a58a126c95aa9e9f2291bdd8111750a49cc4ae7eb3e25854b" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.810358 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-tttxv" Mar 14 09:37:43 crc kubenswrapper[4956]: I0314 09:37:43.833334 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.833318801 podStartE2EDuration="2.833318801s" podCreationTimestamp="2026-03-14 09:37:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:37:43.827927385 +0000 UTC m=+2469.340619653" watchObservedRunningTime="2026-03-14 09:37:43.833318801 +0000 UTC m=+2469.346011069" Mar 14 09:37:44 crc kubenswrapper[4956]: I0314 09:37:44.017564 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:45 crc kubenswrapper[4956]: I0314 09:37:45.155694 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:45 crc kubenswrapper[4956]: I0314 09:37:45.832437 4956 generic.go:334] "Generic (PLEG): container finished" podID="3ae562ee-21b4-4af0-bbf7-3d9df6e6b890" containerID="07b270526837b00a7168e2ebd440ba9db5624dd53d5cfde4467aaadb54dee073" exitCode=0 Mar 14 09:37:45 crc kubenswrapper[4956]: I0314 09:37:45.832518 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890","Type":"ContainerDied","Data":"07b270526837b00a7168e2ebd440ba9db5624dd53d5cfde4467aaadb54dee073"} Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.112122 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.289946 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-custom-prometheus-ca\") pod \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.290061 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-config-data\") pod \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.290110 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b48mf\" (UniqueName: \"kubernetes.io/projected/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-kube-api-access-b48mf\") pod \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.290323 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-combined-ca-bundle\") pod \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.290446 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-logs\") pod \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\" (UID: \"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890\") " Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.291082 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-logs" (OuterVolumeSpecName: "logs") pod "3ae562ee-21b4-4af0-bbf7-3d9df6e6b890" (UID: "3ae562ee-21b4-4af0-bbf7-3d9df6e6b890"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.295841 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-kube-api-access-b48mf" (OuterVolumeSpecName: "kube-api-access-b48mf") pod "3ae562ee-21b4-4af0-bbf7-3d9df6e6b890" (UID: "3ae562ee-21b4-4af0-bbf7-3d9df6e6b890"). InnerVolumeSpecName "kube-api-access-b48mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.311183 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ae562ee-21b4-4af0-bbf7-3d9df6e6b890" (UID: "3ae562ee-21b4-4af0-bbf7-3d9df6e6b890"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.315896 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "3ae562ee-21b4-4af0-bbf7-3d9df6e6b890" (UID: "3ae562ee-21b4-4af0-bbf7-3d9df6e6b890"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.339166 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-config-data" (OuterVolumeSpecName: "config-data") pod "3ae562ee-21b4-4af0-bbf7-3d9df6e6b890" (UID: "3ae562ee-21b4-4af0-bbf7-3d9df6e6b890"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.391927 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.392151 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.392236 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b48mf\" (UniqueName: \"kubernetes.io/projected/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-kube-api-access-b48mf\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.392313 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.392376 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.845648 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"3ae562ee-21b4-4af0-bbf7-3d9df6e6b890","Type":"ContainerDied","Data":"f28c032ba1bf7d5b3e19b5d708fcd4bc59c586ee40f5b4de705bc543d223c9a8"} Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.845704 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.845726 4956 scope.go:117] "RemoveContainer" containerID="07b270526837b00a7168e2ebd440ba9db5624dd53d5cfde4467aaadb54dee073" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.893477 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.908020 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.922958 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:37:46 crc kubenswrapper[4956]: E0314 09:37:46.923606 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3721c946-1a7b-4a9d-8fb8-1455cfa063c8" containerName="keystone-bootstrap" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.923631 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="3721c946-1a7b-4a9d-8fb8-1455cfa063c8" containerName="keystone-bootstrap" Mar 14 09:37:46 crc kubenswrapper[4956]: E0314 09:37:46.923673 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae562ee-21b4-4af0-bbf7-3d9df6e6b890" containerName="watcher-decision-engine" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.923688 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae562ee-21b4-4af0-bbf7-3d9df6e6b890" containerName="watcher-decision-engine" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.923975 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae562ee-21b4-4af0-bbf7-3d9df6e6b890" containerName="watcher-decision-engine" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.924006 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="3721c946-1a7b-4a9d-8fb8-1455cfa063c8" 
containerName="keystone-bootstrap" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.925072 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.927961 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Mar 14 09:37:46 crc kubenswrapper[4956]: I0314 09:37:46.935796 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.002575 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.002626 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.002645 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.002666 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.002729 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.002787 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55x4k\" (UniqueName: \"kubernetes.io/projected/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-kube-api-access-55x4k\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.104169 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55x4k\" (UniqueName: \"kubernetes.io/projected/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-kube-api-access-55x4k\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.104563 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.104618 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.104652 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.104693 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.104754 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.105673 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: 
\"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.109730 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.110505 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.111173 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.111333 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.122969 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55x4k\" (UniqueName: \"kubernetes.io/projected/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-kube-api-access-55x4k\") pod 
\"watcher-kuttl-decision-engine-0\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.221938 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ae562ee-21b4-4af0-bbf7-3d9df6e6b890" path="/var/lib/kubelet/pods/3ae562ee-21b4-4af0-bbf7-3d9df6e6b890/volumes" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.222797 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.242949 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.700777 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:37:47 crc kubenswrapper[4956]: I0314 09:37:47.858803 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1","Type":"ContainerStarted","Data":"4fce818e2507373959d5dcc2f177ea77e721ab6c32299e9c80307f4760cf48d1"} Mar 14 09:37:48 crc kubenswrapper[4956]: I0314 09:37:48.868275 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1","Type":"ContainerStarted","Data":"e92f523176f348d2dbc52c610445d988896cfae9132e8df8ac13fb6c0498ea7f"} Mar 14 09:37:48 crc kubenswrapper[4956]: I0314 09:37:48.898951 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.898932936 podStartE2EDuration="2.898932936s" podCreationTimestamp="2026-03-14 09:37:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:37:48.892686897 +0000 UTC m=+2474.405379165" watchObservedRunningTime="2026-03-14 09:37:48.898932936 +0000 UTC m=+2474.411625464" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.156334 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.169716 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.472356 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/memcached-0" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.671671 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-75c5988f99-28ht4"] Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.672697 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.688947 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-75c5988f99-28ht4"] Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.861170 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-internal-tls-certs\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.861283 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-public-tls-certs\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.861355 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-credential-keys\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.861405 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-config-data\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.861517 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-scripts\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.861562 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-cert-memcached-mtls\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.861637 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpdch\" (UniqueName: \"kubernetes.io/projected/67a35faf-96ec-45eb-af07-2a248bca71e2-kube-api-access-vpdch\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.861698 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-fernet-keys\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.861753 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-combined-ca-bundle\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc 
kubenswrapper[4956]: I0314 09:37:50.893694 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.962770 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-internal-tls-certs\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.963895 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-public-tls-certs\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.964108 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-credential-keys\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.964172 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-config-data\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.964346 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-scripts\") pod 
\"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.964412 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-cert-memcached-mtls\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.964582 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpdch\" (UniqueName: \"kubernetes.io/projected/67a35faf-96ec-45eb-af07-2a248bca71e2-kube-api-access-vpdch\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.964656 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-fernet-keys\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.964800 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-combined-ca-bundle\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.970021 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-public-tls-certs\") pod 
\"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.973131 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-internal-tls-certs\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.973144 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-fernet-keys\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.974125 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-cert-memcached-mtls\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.974382 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-config-data\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.974466 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-combined-ca-bundle\") pod \"keystone-75c5988f99-28ht4\" (UID: 
\"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.988919 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpdch\" (UniqueName: \"kubernetes.io/projected/67a35faf-96ec-45eb-af07-2a248bca71e2-kube-api-access-vpdch\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.989313 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-credential-keys\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:50 crc kubenswrapper[4956]: I0314 09:37:50.991459 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67a35faf-96ec-45eb-af07-2a248bca71e2-scripts\") pod \"keystone-75c5988f99-28ht4\" (UID: \"67a35faf-96ec-45eb-af07-2a248bca71e2\") " pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:51 crc kubenswrapper[4956]: I0314 09:37:51.009919 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:51 crc kubenswrapper[4956]: W0314 09:37:51.443099 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67a35faf_96ec_45eb_af07_2a248bca71e2.slice/crio-b2f2c5b8f5f56e62b942c1dbbbad172ce1040a67df0314fe8b9f0f3ce5694f1d WatchSource:0}: Error finding container b2f2c5b8f5f56e62b942c1dbbbad172ce1040a67df0314fe8b9f0f3ce5694f1d: Status 404 returned error can't find the container with id b2f2c5b8f5f56e62b942c1dbbbad172ce1040a67df0314fe8b9f0f3ce5694f1d Mar 14 09:37:51 crc kubenswrapper[4956]: I0314 09:37:51.445970 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-75c5988f99-28ht4"] Mar 14 09:37:51 crc kubenswrapper[4956]: I0314 09:37:51.893314 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" event={"ID":"67a35faf-96ec-45eb-af07-2a248bca71e2","Type":"ContainerStarted","Data":"219bab3be4f0ea2b334aa0a6b47cf15acfddbd4a093504125f34709c0cf0232f"} Mar 14 09:37:51 crc kubenswrapper[4956]: I0314 09:37:51.893749 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:37:51 crc kubenswrapper[4956]: I0314 09:37:51.893763 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" event={"ID":"67a35faf-96ec-45eb-af07-2a248bca71e2","Type":"ContainerStarted","Data":"b2f2c5b8f5f56e62b942c1dbbbad172ce1040a67df0314fe8b9f0f3ce5694f1d"} Mar 14 09:37:51 crc kubenswrapper[4956]: I0314 09:37:51.914336 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" podStartSLOduration=1.914316628 podStartE2EDuration="1.914316628s" podCreationTimestamp="2026-03-14 09:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:37:51.9116084 +0000 UTC m=+2477.424300668" watchObservedRunningTime="2026-03-14 09:37:51.914316628 +0000 UTC m=+2477.427008896" Mar 14 09:37:52 crc kubenswrapper[4956]: I0314 09:37:52.209373 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:52 crc kubenswrapper[4956]: I0314 09:37:52.238272 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:52 crc kubenswrapper[4956]: I0314 09:37:52.286631 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:37:52 crc kubenswrapper[4956]: I0314 09:37:52.901310 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="b78ced78-9d78-4bed-ab13-0f14be6bcb47" containerName="watcher-api" containerID="cri-o://adead90c8f7ee235deb6101678c78fecf006b3bea9c5a1f20b661356371de4f1" gracePeriod=30 Mar 14 09:37:52 crc kubenswrapper[4956]: I0314 09:37:52.901286 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="b78ced78-9d78-4bed-ab13-0f14be6bcb47" containerName="watcher-kuttl-api-log" containerID="cri-o://73884d6b4bb61fd74a2c93360dae3a9c83e30cd6d8b7095899833a71ae09c38d" gracePeriod=30 Mar 14 09:37:52 crc kubenswrapper[4956]: I0314 09:37:52.931725 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:37:53 crc kubenswrapper[4956]: I0314 09:37:53.944243 4956 generic.go:334] "Generic (PLEG): container finished" podID="b78ced78-9d78-4bed-ab13-0f14be6bcb47" containerID="adead90c8f7ee235deb6101678c78fecf006b3bea9c5a1f20b661356371de4f1" exitCode=0 Mar 14 09:37:53 crc kubenswrapper[4956]: I0314 
09:37:53.944642 4956 generic.go:334] "Generic (PLEG): container finished" podID="b78ced78-9d78-4bed-ab13-0f14be6bcb47" containerID="73884d6b4bb61fd74a2c93360dae3a9c83e30cd6d8b7095899833a71ae09c38d" exitCode=143 Mar 14 09:37:53 crc kubenswrapper[4956]: I0314 09:37:53.944359 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b78ced78-9d78-4bed-ab13-0f14be6bcb47","Type":"ContainerDied","Data":"adead90c8f7ee235deb6101678c78fecf006b3bea9c5a1f20b661356371de4f1"} Mar 14 09:37:53 crc kubenswrapper[4956]: I0314 09:37:53.944738 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b78ced78-9d78-4bed-ab13-0f14be6bcb47","Type":"ContainerDied","Data":"73884d6b4bb61fd74a2c93360dae3a9c83e30cd6d8b7095899833a71ae09c38d"} Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.323152 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.428830 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b78ced78-9d78-4bed-ab13-0f14be6bcb47-logs\") pod \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.428876 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-public-tls-certs\") pod \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.428903 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-combined-ca-bundle\") pod 
\"b78ced78-9d78-4bed-ab13-0f14be6bcb47\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.428946 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-cert-memcached-mtls\") pod \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.428998 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-config-data\") pod \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.429099 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-internal-tls-certs\") pod \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.429142 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpzvx\" (UniqueName: \"kubernetes.io/projected/b78ced78-9d78-4bed-ab13-0f14be6bcb47-kube-api-access-gpzvx\") pod \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.429164 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-custom-prometheus-ca\") pod \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\" (UID: \"b78ced78-9d78-4bed-ab13-0f14be6bcb47\") " Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.430553 4956 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b78ced78-9d78-4bed-ab13-0f14be6bcb47-logs" (OuterVolumeSpecName: "logs") pod "b78ced78-9d78-4bed-ab13-0f14be6bcb47" (UID: "b78ced78-9d78-4bed-ab13-0f14be6bcb47"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.435828 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b78ced78-9d78-4bed-ab13-0f14be6bcb47-kube-api-access-gpzvx" (OuterVolumeSpecName: "kube-api-access-gpzvx") pod "b78ced78-9d78-4bed-ab13-0f14be6bcb47" (UID: "b78ced78-9d78-4bed-ab13-0f14be6bcb47"). InnerVolumeSpecName "kube-api-access-gpzvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.467508 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b78ced78-9d78-4bed-ab13-0f14be6bcb47" (UID: "b78ced78-9d78-4bed-ab13-0f14be6bcb47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.468760 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "b78ced78-9d78-4bed-ab13-0f14be6bcb47" (UID: "b78ced78-9d78-4bed-ab13-0f14be6bcb47"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.483727 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b78ced78-9d78-4bed-ab13-0f14be6bcb47" (UID: "b78ced78-9d78-4bed-ab13-0f14be6bcb47"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.487300 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-config-data" (OuterVolumeSpecName: "config-data") pod "b78ced78-9d78-4bed-ab13-0f14be6bcb47" (UID: "b78ced78-9d78-4bed-ab13-0f14be6bcb47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.489521 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b78ced78-9d78-4bed-ab13-0f14be6bcb47" (UID: "b78ced78-9d78-4bed-ab13-0f14be6bcb47"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.499738 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "b78ced78-9d78-4bed-ab13-0f14be6bcb47" (UID: "b78ced78-9d78-4bed-ab13-0f14be6bcb47"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.531252 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpzvx\" (UniqueName: \"kubernetes.io/projected/b78ced78-9d78-4bed-ab13-0f14be6bcb47-kube-api-access-gpzvx\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.531286 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.531295 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b78ced78-9d78-4bed-ab13-0f14be6bcb47-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.531305 4956 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.531313 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.531321 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.531328 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.531336 
4956 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b78ced78-9d78-4bed-ab13-0f14be6bcb47-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:54 crc kubenswrapper[4956]: E0314 09:37:54.577991 4956 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.32:53650->38.102.83.32:40407: write tcp 38.102.83.32:53650->38.102.83.32:40407: write: broken pipe Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.962611 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"b78ced78-9d78-4bed-ab13-0f14be6bcb47","Type":"ContainerDied","Data":"886639d4f8fe5e0998dc9ecafbffb534acca3866e3a1b43c8c2e5b11cd8bf810"} Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.962660 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.962700 4956 scope.go:117] "RemoveContainer" containerID="adead90c8f7ee235deb6101678c78fecf006b3bea9c5a1f20b661356371de4f1" Mar 14 09:37:54 crc kubenswrapper[4956]: I0314 09:37:54.999180 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.007458 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.030692 4956 scope.go:117] "RemoveContainer" containerID="73884d6b4bb61fd74a2c93360dae3a9c83e30cd6d8b7095899833a71ae09c38d" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.042554 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:37:55 crc kubenswrapper[4956]: E0314 09:37:55.043016 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b78ced78-9d78-4bed-ab13-0f14be6bcb47" containerName="watcher-kuttl-api-log" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.043042 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b78ced78-9d78-4bed-ab13-0f14be6bcb47" containerName="watcher-kuttl-api-log" Mar 14 09:37:55 crc kubenswrapper[4956]: E0314 09:37:55.043064 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b78ced78-9d78-4bed-ab13-0f14be6bcb47" containerName="watcher-api" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.043073 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b78ced78-9d78-4bed-ab13-0f14be6bcb47" containerName="watcher-api" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.043298 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b78ced78-9d78-4bed-ab13-0f14be6bcb47" containerName="watcher-kuttl-api-log" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.043322 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b78ced78-9d78-4bed-ab13-0f14be6bcb47" containerName="watcher-api" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.044622 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.054254 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.056397 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.143800 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z4nf\" (UniqueName: \"kubernetes.io/projected/c408cd39-4dba-492c-b2d8-3be77e9179e2-kube-api-access-5z4nf\") pod \"watcher-kuttl-api-0\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.143866 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.144431 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.144642 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: 
\"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.144792 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.144852 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c408cd39-4dba-492c-b2d8-3be77e9179e2-logs\") pod \"watcher-kuttl-api-0\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.216526 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:37:55 crc kubenswrapper[4956]: E0314 09:37:55.216828 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.225261 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b78ced78-9d78-4bed-ab13-0f14be6bcb47" path="/var/lib/kubelet/pods/b78ced78-9d78-4bed-ab13-0f14be6bcb47/volumes" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.246080 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.246158 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.246201 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.246236 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c408cd39-4dba-492c-b2d8-3be77e9179e2-logs\") pod \"watcher-kuttl-api-0\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.246273 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z4nf\" (UniqueName: \"kubernetes.io/projected/c408cd39-4dba-492c-b2d8-3be77e9179e2-kube-api-access-5z4nf\") pod \"watcher-kuttl-api-0\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.246294 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: 
\"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.247205 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c408cd39-4dba-492c-b2d8-3be77e9179e2-logs\") pod \"watcher-kuttl-api-0\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.252059 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.252130 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.253420 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.269759 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-config-data\") pod \"watcher-kuttl-api-0\" (UID: 
\"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.273468 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z4nf\" (UniqueName: \"kubernetes.io/projected/c408cd39-4dba-492c-b2d8-3be77e9179e2-kube-api-access-5z4nf\") pod \"watcher-kuttl-api-0\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.380058 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.842506 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:37:55 crc kubenswrapper[4956]: I0314 09:37:55.973191 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"c408cd39-4dba-492c-b2d8-3be77e9179e2","Type":"ContainerStarted","Data":"356ce07558da0f41591864ce8eb59ee15753ca5ec85819e9bc2fbe7f8541f13f"} Mar 14 09:37:56 crc kubenswrapper[4956]: I0314 09:37:56.982792 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"c408cd39-4dba-492c-b2d8-3be77e9179e2","Type":"ContainerStarted","Data":"73ffaff83cc52a5cf5f960099d42881819951c83749d7d5d117ca4c24be35d85"} Mar 14 09:37:56 crc kubenswrapper[4956]: I0314 09:37:56.983100 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:37:56 crc kubenswrapper[4956]: I0314 09:37:56.983112 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"c408cd39-4dba-492c-b2d8-3be77e9179e2","Type":"ContainerStarted","Data":"952381d948f0f675652924c7cb21e8e03c2da0a36101decc60458562c089419d"} Mar 14 09:37:57 crc 
kubenswrapper[4956]: I0314 09:37:57.007813 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.007787228 podStartE2EDuration="2.007787228s" podCreationTimestamp="2026-03-14 09:37:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:37:57.004086774 +0000 UTC m=+2482.516779062" watchObservedRunningTime="2026-03-14 09:37:57.007787228 +0000 UTC m=+2482.520479486" Mar 14 09:37:57 crc kubenswrapper[4956]: I0314 09:37:57.243393 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:57 crc kubenswrapper[4956]: I0314 09:37:57.269844 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:57 crc kubenswrapper[4956]: I0314 09:37:57.991955 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:58 crc kubenswrapper[4956]: I0314 09:37:58.022201 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:37:59 crc kubenswrapper[4956]: I0314 09:37:59.295671 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:37:59 crc kubenswrapper[4956]: I0314 09:37:59.546418 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:38:00 crc kubenswrapper[4956]: I0314 09:38:00.164158 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558018-qrkhg"] Mar 14 09:38:00 crc kubenswrapper[4956]: I0314 09:38:00.165288 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558018-qrkhg" Mar 14 09:38:00 crc kubenswrapper[4956]: I0314 09:38:00.168506 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:38:00 crc kubenswrapper[4956]: I0314 09:38:00.169166 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:38:00 crc kubenswrapper[4956]: I0314 09:38:00.169400 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:38:00 crc kubenswrapper[4956]: I0314 09:38:00.180928 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558018-qrkhg"] Mar 14 09:38:00 crc kubenswrapper[4956]: I0314 09:38:00.256803 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f4cz\" (UniqueName: \"kubernetes.io/projected/6db0f53f-2121-43af-bffe-84fdafb9817a-kube-api-access-4f4cz\") pod \"auto-csr-approver-29558018-qrkhg\" (UID: \"6db0f53f-2121-43af-bffe-84fdafb9817a\") " pod="openshift-infra/auto-csr-approver-29558018-qrkhg" Mar 14 09:38:00 crc kubenswrapper[4956]: I0314 09:38:00.358805 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f4cz\" (UniqueName: \"kubernetes.io/projected/6db0f53f-2121-43af-bffe-84fdafb9817a-kube-api-access-4f4cz\") pod \"auto-csr-approver-29558018-qrkhg\" (UID: \"6db0f53f-2121-43af-bffe-84fdafb9817a\") " pod="openshift-infra/auto-csr-approver-29558018-qrkhg" Mar 14 09:38:00 crc kubenswrapper[4956]: I0314 09:38:00.380514 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f4cz\" (UniqueName: \"kubernetes.io/projected/6db0f53f-2121-43af-bffe-84fdafb9817a-kube-api-access-4f4cz\") pod \"auto-csr-approver-29558018-qrkhg\" (UID: \"6db0f53f-2121-43af-bffe-84fdafb9817a\") " 
pod="openshift-infra/auto-csr-approver-29558018-qrkhg" Mar 14 09:38:00 crc kubenswrapper[4956]: I0314 09:38:00.380579 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:38:00 crc kubenswrapper[4956]: I0314 09:38:00.488690 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558018-qrkhg" Mar 14 09:38:00 crc kubenswrapper[4956]: I0314 09:38:00.774236 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558018-qrkhg"] Mar 14 09:38:01 crc kubenswrapper[4956]: I0314 09:38:01.017677 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558018-qrkhg" event={"ID":"6db0f53f-2121-43af-bffe-84fdafb9817a","Type":"ContainerStarted","Data":"a607ffee4437161404f5e6a34628e3e10349dc36c09eb22d49dd60c873645c51"} Mar 14 09:38:03 crc kubenswrapper[4956]: I0314 09:38:03.036812 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558018-qrkhg" event={"ID":"6db0f53f-2121-43af-bffe-84fdafb9817a","Type":"ContainerStarted","Data":"79d640ecd201324b505f6908b2335494de33703b8d9eb138bb4afdbe84e8f1bc"} Mar 14 09:38:03 crc kubenswrapper[4956]: I0314 09:38:03.053863 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558018-qrkhg" podStartSLOduration=1.278084913 podStartE2EDuration="3.053829289s" podCreationTimestamp="2026-03-14 09:38:00 +0000 UTC" firstStartedPulling="2026-03-14 09:38:00.788894411 +0000 UTC m=+2486.301586679" lastFinishedPulling="2026-03-14 09:38:02.564638787 +0000 UTC m=+2488.077331055" observedRunningTime="2026-03-14 09:38:03.052469755 +0000 UTC m=+2488.565162023" watchObservedRunningTime="2026-03-14 09:38:03.053829289 +0000 UTC m=+2488.566521557" Mar 14 09:38:04 crc kubenswrapper[4956]: I0314 09:38:04.048962 4956 generic.go:334] "Generic (PLEG): container 
finished" podID="6db0f53f-2121-43af-bffe-84fdafb9817a" containerID="79d640ecd201324b505f6908b2335494de33703b8d9eb138bb4afdbe84e8f1bc" exitCode=0 Mar 14 09:38:04 crc kubenswrapper[4956]: I0314 09:38:04.049054 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558018-qrkhg" event={"ID":"6db0f53f-2121-43af-bffe-84fdafb9817a","Type":"ContainerDied","Data":"79d640ecd201324b505f6908b2335494de33703b8d9eb138bb4afdbe84e8f1bc"} Mar 14 09:38:05 crc kubenswrapper[4956]: I0314 09:38:05.380907 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:38:05 crc kubenswrapper[4956]: I0314 09:38:05.390824 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:38:05 crc kubenswrapper[4956]: I0314 09:38:05.406953 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558018-qrkhg" Mar 14 09:38:05 crc kubenswrapper[4956]: I0314 09:38:05.455563 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f4cz\" (UniqueName: \"kubernetes.io/projected/6db0f53f-2121-43af-bffe-84fdafb9817a-kube-api-access-4f4cz\") pod \"6db0f53f-2121-43af-bffe-84fdafb9817a\" (UID: \"6db0f53f-2121-43af-bffe-84fdafb9817a\") " Mar 14 09:38:05 crc kubenswrapper[4956]: I0314 09:38:05.464136 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6db0f53f-2121-43af-bffe-84fdafb9817a-kube-api-access-4f4cz" (OuterVolumeSpecName: "kube-api-access-4f4cz") pod "6db0f53f-2121-43af-bffe-84fdafb9817a" (UID: "6db0f53f-2121-43af-bffe-84fdafb9817a"). InnerVolumeSpecName "kube-api-access-4f4cz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:38:05 crc kubenswrapper[4956]: I0314 09:38:05.558039 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f4cz\" (UniqueName: \"kubernetes.io/projected/6db0f53f-2121-43af-bffe-84fdafb9817a-kube-api-access-4f4cz\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:06 crc kubenswrapper[4956]: I0314 09:38:06.073705 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558018-qrkhg" Mar 14 09:38:06 crc kubenswrapper[4956]: I0314 09:38:06.073799 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558018-qrkhg" event={"ID":"6db0f53f-2121-43af-bffe-84fdafb9817a","Type":"ContainerDied","Data":"a607ffee4437161404f5e6a34628e3e10349dc36c09eb22d49dd60c873645c51"} Mar 14 09:38:06 crc kubenswrapper[4956]: I0314 09:38:06.074391 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a607ffee4437161404f5e6a34628e3e10349dc36c09eb22d49dd60c873645c51" Mar 14 09:38:06 crc kubenswrapper[4956]: I0314 09:38:06.080628 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:38:06 crc kubenswrapper[4956]: I0314 09:38:06.146529 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558012-nsvh5"] Mar 14 09:38:06 crc kubenswrapper[4956]: I0314 09:38:06.162857 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558012-nsvh5"] Mar 14 09:38:07 crc kubenswrapper[4956]: I0314 09:38:07.234137 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8265b6df-2302-441f-95b2-3115520e0c53" path="/var/lib/kubelet/pods/8265b6df-2302-441f-95b2-3115520e0c53/volumes" Mar 14 09:38:09 crc kubenswrapper[4956]: I0314 09:38:09.213774 4956 scope.go:117] "RemoveContainer" 
containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:38:09 crc kubenswrapper[4956]: E0314 09:38:09.214606 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:38:21 crc kubenswrapper[4956]: I0314 09:38:21.209260 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:38:21 crc kubenswrapper[4956]: E0314 09:38:21.210015 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:38:22 crc kubenswrapper[4956]: I0314 09:38:22.676005 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/keystone-75c5988f99-28ht4" Mar 14 09:38:22 crc kubenswrapper[4956]: I0314 09:38:22.744750 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-6b8d58878f-dcbq5"] Mar 14 09:38:22 crc kubenswrapper[4956]: I0314 09:38:22.745039 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" podUID="47b1a93a-9d42-4390-a824-7a86cd4f9be0" containerName="keystone-api" containerID="cri-o://55cbaf0e1b37a8595c5ef6237c4da433e2d1f7de027ebbc5a3c44805950cff57" gracePeriod=30 Mar 14 09:38:26 crc 
kubenswrapper[4956]: I0314 09:38:26.244336 4956 generic.go:334] "Generic (PLEG): container finished" podID="47b1a93a-9d42-4390-a824-7a86cd4f9be0" containerID="55cbaf0e1b37a8595c5ef6237c4da433e2d1f7de027ebbc5a3c44805950cff57" exitCode=0 Mar 14 09:38:26 crc kubenswrapper[4956]: I0314 09:38:26.244922 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" event={"ID":"47b1a93a-9d42-4390-a824-7a86cd4f9be0","Type":"ContainerDied","Data":"55cbaf0e1b37a8595c5ef6237c4da433e2d1f7de027ebbc5a3c44805950cff57"} Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.411357 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.441171 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zzxq\" (UniqueName: \"kubernetes.io/projected/47b1a93a-9d42-4390-a824-7a86cd4f9be0-kube-api-access-5zzxq\") pod \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.441234 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-internal-tls-certs\") pod \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.441274 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-credential-keys\") pod \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.441329 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-combined-ca-bundle\") pod \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.441358 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-public-tls-certs\") pod \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.441393 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-scripts\") pod \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.441415 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-fernet-keys\") pod \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.441465 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-config-data\") pod \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\" (UID: \"47b1a93a-9d42-4390-a824-7a86cd4f9be0\") " Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.450035 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-scripts" (OuterVolumeSpecName: "scripts") pod "47b1a93a-9d42-4390-a824-7a86cd4f9be0" (UID: "47b1a93a-9d42-4390-a824-7a86cd4f9be0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.452682 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47b1a93a-9d42-4390-a824-7a86cd4f9be0-kube-api-access-5zzxq" (OuterVolumeSpecName: "kube-api-access-5zzxq") pod "47b1a93a-9d42-4390-a824-7a86cd4f9be0" (UID: "47b1a93a-9d42-4390-a824-7a86cd4f9be0"). InnerVolumeSpecName "kube-api-access-5zzxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.453228 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "47b1a93a-9d42-4390-a824-7a86cd4f9be0" (UID: "47b1a93a-9d42-4390-a824-7a86cd4f9be0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.457731 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "47b1a93a-9d42-4390-a824-7a86cd4f9be0" (UID: "47b1a93a-9d42-4390-a824-7a86cd4f9be0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.475415 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-config-data" (OuterVolumeSpecName: "config-data") pod "47b1a93a-9d42-4390-a824-7a86cd4f9be0" (UID: "47b1a93a-9d42-4390-a824-7a86cd4f9be0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.477618 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47b1a93a-9d42-4390-a824-7a86cd4f9be0" (UID: "47b1a93a-9d42-4390-a824-7a86cd4f9be0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.489994 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "47b1a93a-9d42-4390-a824-7a86cd4f9be0" (UID: "47b1a93a-9d42-4390-a824-7a86cd4f9be0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.492927 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "47b1a93a-9d42-4390-a824-7a86cd4f9be0" (UID: "47b1a93a-9d42-4390-a824-7a86cd4f9be0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.543624 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.543661 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zzxq\" (UniqueName: \"kubernetes.io/projected/47b1a93a-9d42-4390-a824-7a86cd4f9be0-kube-api-access-5zzxq\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.543670 4956 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.543680 4956 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.543690 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.543699 4956 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.543707 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:27 crc kubenswrapper[4956]: I0314 09:38:27.543715 4956 
reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47b1a93a-9d42-4390-a824-7a86cd4f9be0-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:28 crc kubenswrapper[4956]: I0314 09:38:28.358894 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" event={"ID":"47b1a93a-9d42-4390-a824-7a86cd4f9be0","Type":"ContainerDied","Data":"0cf9f0458d91112c9274f0f0fbbc7fb3ee5fbed0b4eed91c137aa075f7f27cf7"} Mar 14 09:38:28 crc kubenswrapper[4956]: I0314 09:38:28.358939 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-6b8d58878f-dcbq5" Mar 14 09:38:28 crc kubenswrapper[4956]: I0314 09:38:28.359282 4956 scope.go:117] "RemoveContainer" containerID="55cbaf0e1b37a8595c5ef6237c4da433e2d1f7de027ebbc5a3c44805950cff57" Mar 14 09:38:28 crc kubenswrapper[4956]: I0314 09:38:28.413069 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-6b8d58878f-dcbq5"] Mar 14 09:38:28 crc kubenswrapper[4956]: I0314 09:38:28.429593 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-6b8d58878f-dcbq5"] Mar 14 09:38:29 crc kubenswrapper[4956]: I0314 09:38:29.224033 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47b1a93a-9d42-4390-a824-7a86cd4f9be0" path="/var/lib/kubelet/pods/47b1a93a-9d42-4390-a824-7a86cd4f9be0/volumes" Mar 14 09:38:30 crc kubenswrapper[4956]: I0314 09:38:30.737930 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:38:30 crc kubenswrapper[4956]: I0314 09:38:30.739267 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerName="ceilometer-central-agent" containerID="cri-o://8a6c82a88d310589e2c13639d8725404c296e45576e0941be14e1263b40dffcc" 
gracePeriod=30 Mar 14 09:38:30 crc kubenswrapper[4956]: I0314 09:38:30.739361 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerName="sg-core" containerID="cri-o://85c78f73d76851ad60e66f3dd7318eee356254f09a7c201a5201d306d3f4c103" gracePeriod=30 Mar 14 09:38:30 crc kubenswrapper[4956]: I0314 09:38:30.739545 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerName="ceilometer-notification-agent" containerID="cri-o://043fece4492dd96929a18f91a3ed30ef7e9495e0ce541358bed56d7bf4f9ced0" gracePeriod=30 Mar 14 09:38:30 crc kubenswrapper[4956]: I0314 09:38:30.739582 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerName="proxy-httpd" containerID="cri-o://8526826613a3132376d5ee31843e1927f8a239198f8f22204857ac298b732c9b" gracePeriod=30 Mar 14 09:38:31 crc kubenswrapper[4956]: I0314 09:38:31.388225 4956 generic.go:334] "Generic (PLEG): container finished" podID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerID="8526826613a3132376d5ee31843e1927f8a239198f8f22204857ac298b732c9b" exitCode=0 Mar 14 09:38:31 crc kubenswrapper[4956]: I0314 09:38:31.388578 4956 generic.go:334] "Generic (PLEG): container finished" podID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerID="85c78f73d76851ad60e66f3dd7318eee356254f09a7c201a5201d306d3f4c103" exitCode=2 Mar 14 09:38:31 crc kubenswrapper[4956]: I0314 09:38:31.388592 4956 generic.go:334] "Generic (PLEG): container finished" podID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerID="8a6c82a88d310589e2c13639d8725404c296e45576e0941be14e1263b40dffcc" exitCode=0 Mar 14 09:38:31 crc kubenswrapper[4956]: I0314 09:38:31.388284 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6732f3ac-41dd-4304-9058-fd30a7eb3f37","Type":"ContainerDied","Data":"8526826613a3132376d5ee31843e1927f8a239198f8f22204857ac298b732c9b"} Mar 14 09:38:31 crc kubenswrapper[4956]: I0314 09:38:31.388640 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6732f3ac-41dd-4304-9058-fd30a7eb3f37","Type":"ContainerDied","Data":"85c78f73d76851ad60e66f3dd7318eee356254f09a7c201a5201d306d3f4c103"} Mar 14 09:38:31 crc kubenswrapper[4956]: I0314 09:38:31.388659 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6732f3ac-41dd-4304-9058-fd30a7eb3f37","Type":"ContainerDied","Data":"8a6c82a88d310589e2c13639d8725404c296e45576e0941be14e1263b40dffcc"} Mar 14 09:38:32 crc kubenswrapper[4956]: I0314 09:38:32.209299 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:38:32 crc kubenswrapper[4956]: E0314 09:38:32.209852 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.429160 4956 generic.go:334] "Generic (PLEG): container finished" podID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerID="043fece4492dd96929a18f91a3ed30ef7e9495e0ce541358bed56d7bf4f9ced0" exitCode=0 Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.429459 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"6732f3ac-41dd-4304-9058-fd30a7eb3f37","Type":"ContainerDied","Data":"043fece4492dd96929a18f91a3ed30ef7e9495e0ce541358bed56d7bf4f9ced0"} Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.429966 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6732f3ac-41dd-4304-9058-fd30a7eb3f37","Type":"ContainerDied","Data":"e0c765871f5043d8fad2a0e0eb695f9e36c6937c0b03c8d70c53c22f9bd8a548"} Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.429989 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0c765871f5043d8fad2a0e0eb695f9e36c6937c0b03c8d70c53c22f9bd8a548" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.430756 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.535065 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-config-data\") pod \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.535141 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgmx2\" (UniqueName: \"kubernetes.io/projected/6732f3ac-41dd-4304-9058-fd30a7eb3f37-kube-api-access-dgmx2\") pod \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.535297 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-ceilometer-tls-certs\") pod \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.535334 
4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-sg-core-conf-yaml\") pod \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.535392 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-combined-ca-bundle\") pod \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.535448 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-scripts\") pod \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.535513 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6732f3ac-41dd-4304-9058-fd30a7eb3f37-run-httpd\") pod \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.535549 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6732f3ac-41dd-4304-9058-fd30a7eb3f37-log-httpd\") pod \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\" (UID: \"6732f3ac-41dd-4304-9058-fd30a7eb3f37\") " Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.536624 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6732f3ac-41dd-4304-9058-fd30a7eb3f37-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6732f3ac-41dd-4304-9058-fd30a7eb3f37" (UID: 
"6732f3ac-41dd-4304-9058-fd30a7eb3f37"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.539711 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6732f3ac-41dd-4304-9058-fd30a7eb3f37-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6732f3ac-41dd-4304-9058-fd30a7eb3f37" (UID: "6732f3ac-41dd-4304-9058-fd30a7eb3f37"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.543640 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-scripts" (OuterVolumeSpecName: "scripts") pod "6732f3ac-41dd-4304-9058-fd30a7eb3f37" (UID: "6732f3ac-41dd-4304-9058-fd30a7eb3f37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.543712 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6732f3ac-41dd-4304-9058-fd30a7eb3f37-kube-api-access-dgmx2" (OuterVolumeSpecName: "kube-api-access-dgmx2") pod "6732f3ac-41dd-4304-9058-fd30a7eb3f37" (UID: "6732f3ac-41dd-4304-9058-fd30a7eb3f37"). InnerVolumeSpecName "kube-api-access-dgmx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.575819 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6732f3ac-41dd-4304-9058-fd30a7eb3f37" (UID: "6732f3ac-41dd-4304-9058-fd30a7eb3f37"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.601993 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6732f3ac-41dd-4304-9058-fd30a7eb3f37" (UID: "6732f3ac-41dd-4304-9058-fd30a7eb3f37"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.634332 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-config-data" (OuterVolumeSpecName: "config-data") pod "6732f3ac-41dd-4304-9058-fd30a7eb3f37" (UID: "6732f3ac-41dd-4304-9058-fd30a7eb3f37"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.635314 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6732f3ac-41dd-4304-9058-fd30a7eb3f37" (UID: "6732f3ac-41dd-4304-9058-fd30a7eb3f37"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.638296 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.638335 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgmx2\" (UniqueName: \"kubernetes.io/projected/6732f3ac-41dd-4304-9058-fd30a7eb3f37-kube-api-access-dgmx2\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.638347 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.638357 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.638366 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.638374 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6732f3ac-41dd-4304-9058-fd30a7eb3f37-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.638384 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6732f3ac-41dd-4304-9058-fd30a7eb3f37-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:34 crc kubenswrapper[4956]: I0314 09:38:34.638395 4956 
reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6732f3ac-41dd-4304-9058-fd30a7eb3f37-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.438266 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.461706 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.469071 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.484374 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:38:35 crc kubenswrapper[4956]: E0314 09:38:35.484799 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db0f53f-2121-43af-bffe-84fdafb9817a" containerName="oc" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.484818 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db0f53f-2121-43af-bffe-84fdafb9817a" containerName="oc" Mar 14 09:38:35 crc kubenswrapper[4956]: E0314 09:38:35.484834 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b1a93a-9d42-4390-a824-7a86cd4f9be0" containerName="keystone-api" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.484840 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b1a93a-9d42-4390-a824-7a86cd4f9be0" containerName="keystone-api" Mar 14 09:38:35 crc kubenswrapper[4956]: E0314 09:38:35.484857 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerName="ceilometer-central-agent" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.484862 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" 
containerName="ceilometer-central-agent" Mar 14 09:38:35 crc kubenswrapper[4956]: E0314 09:38:35.484875 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerName="sg-core" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.484881 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerName="sg-core" Mar 14 09:38:35 crc kubenswrapper[4956]: E0314 09:38:35.484894 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerName="proxy-httpd" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.484903 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerName="proxy-httpd" Mar 14 09:38:35 crc kubenswrapper[4956]: E0314 09:38:35.484911 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerName="ceilometer-notification-agent" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.484917 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerName="ceilometer-notification-agent" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.485054 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db0f53f-2121-43af-bffe-84fdafb9817a" containerName="oc" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.485064 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerName="ceilometer-notification-agent" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.485075 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerName="proxy-httpd" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.485091 4956 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerName="sg-core" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.485099 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="47b1a93a-9d42-4390-a824-7a86cd4f9be0" containerName="keystone-api" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.485109 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" containerName="ceilometer-central-agent" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.486858 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.489645 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.491556 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.491679 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.499694 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.653640 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.653729 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-config-data\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.653771 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8342e7d4-b4ee-48fa-b84b-692775aacba0-log-httpd\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.653888 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.653930 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8342e7d4-b4ee-48fa-b84b-692775aacba0-run-httpd\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.653960 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.654049 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-scripts\") pod 
\"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.654155 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8wfj\" (UniqueName: \"kubernetes.io/projected/8342e7d4-b4ee-48fa-b84b-692775aacba0-kube-api-access-p8wfj\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.755404 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.755464 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8342e7d4-b4ee-48fa-b84b-692775aacba0-run-httpd\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.755508 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.755536 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-scripts\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc 
kubenswrapper[4956]: I0314 09:38:35.755576 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8wfj\" (UniqueName: \"kubernetes.io/projected/8342e7d4-b4ee-48fa-b84b-692775aacba0-kube-api-access-p8wfj\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.755612 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.755664 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-config-data\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.755716 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8342e7d4-b4ee-48fa-b84b-692775aacba0-log-httpd\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.756179 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8342e7d4-b4ee-48fa-b84b-692775aacba0-log-httpd\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.756649 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8342e7d4-b4ee-48fa-b84b-692775aacba0-run-httpd\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.760173 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.760296 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-config-data\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.760774 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.762619 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-scripts\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.763405 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 
crc kubenswrapper[4956]: I0314 09:38:35.773961 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8wfj\" (UniqueName: \"kubernetes.io/projected/8342e7d4-b4ee-48fa-b84b-692775aacba0-kube-api-access-p8wfj\") pod \"ceilometer-0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:35 crc kubenswrapper[4956]: I0314 09:38:35.803359 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:36 crc kubenswrapper[4956]: I0314 09:38:36.385190 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:38:36 crc kubenswrapper[4956]: W0314 09:38:36.387402 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8342e7d4_b4ee_48fa_b84b_692775aacba0.slice/crio-a81c9cbec1f3dd426f553b13704e6bc9bdd2012486876b840c3a305e948f9a2c WatchSource:0}: Error finding container a81c9cbec1f3dd426f553b13704e6bc9bdd2012486876b840c3a305e948f9a2c: Status 404 returned error can't find the container with id a81c9cbec1f3dd426f553b13704e6bc9bdd2012486876b840c3a305e948f9a2c Mar 14 09:38:36 crc kubenswrapper[4956]: I0314 09:38:36.390336 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:38:36 crc kubenswrapper[4956]: I0314 09:38:36.447270 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8342e7d4-b4ee-48fa-b84b-692775aacba0","Type":"ContainerStarted","Data":"a81c9cbec1f3dd426f553b13704e6bc9bdd2012486876b840c3a305e948f9a2c"} Mar 14 09:38:37 crc kubenswrapper[4956]: I0314 09:38:37.225211 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6732f3ac-41dd-4304-9058-fd30a7eb3f37" path="/var/lib/kubelet/pods/6732f3ac-41dd-4304-9058-fd30a7eb3f37/volumes" Mar 14 09:38:38 crc 
kubenswrapper[4956]: I0314 09:38:38.480180 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8342e7d4-b4ee-48fa-b84b-692775aacba0","Type":"ContainerStarted","Data":"ba5d7af0b5787d7566fabb9b605b6d86ec49eca5186667be141d6fbf73cb07ef"} Mar 14 09:38:39 crc kubenswrapper[4956]: I0314 09:38:39.489648 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8342e7d4-b4ee-48fa-b84b-692775aacba0","Type":"ContainerStarted","Data":"4f4b4dbb0bf7da9822bd591423ba49625dbb430fd045c59220600187d7fd8c71"} Mar 14 09:38:39 crc kubenswrapper[4956]: I0314 09:38:39.490223 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8342e7d4-b4ee-48fa-b84b-692775aacba0","Type":"ContainerStarted","Data":"59438924ff1af4b89ec559a8cc5d7c9f17c1c34956ed7ae42084f8d64ad76fc8"} Mar 14 09:38:42 crc kubenswrapper[4956]: I0314 09:38:42.518762 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8342e7d4-b4ee-48fa-b84b-692775aacba0","Type":"ContainerStarted","Data":"cfe179e8829f7fe8ba039db39685bde525628ea0274ea935d1996f9577d67087"} Mar 14 09:38:42 crc kubenswrapper[4956]: I0314 09:38:42.519435 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:38:42 crc kubenswrapper[4956]: I0314 09:38:42.544080 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.041008595 podStartE2EDuration="7.544063323s" podCreationTimestamp="2026-03-14 09:38:35 +0000 UTC" firstStartedPulling="2026-03-14 09:38:36.3900736 +0000 UTC m=+2521.902765868" lastFinishedPulling="2026-03-14 09:38:41.893128328 +0000 UTC m=+2527.405820596" observedRunningTime="2026-03-14 09:38:42.542114513 +0000 UTC m=+2528.054806791" watchObservedRunningTime="2026-03-14 09:38:42.544063323 +0000 UTC 
m=+2528.056755591" Mar 14 09:38:47 crc kubenswrapper[4956]: I0314 09:38:47.209420 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:38:47 crc kubenswrapper[4956]: E0314 09:38:47.210239 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:38:55 crc kubenswrapper[4956]: I0314 09:38:55.968194 4956 scope.go:117] "RemoveContainer" containerID="2edd9981f00a5617f93a39d6d35ad463add520753842bb1b43158b0a1c81cfb2" Mar 14 09:39:02 crc kubenswrapper[4956]: I0314 09:39:02.209320 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:39:02 crc kubenswrapper[4956]: E0314 09:39:02.210189 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.562263 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-46924"] Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.571387 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-46924"] Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.620255 4956 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["watcher-kuttl-default/watcher6c35-account-delete-djlwq"] Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.621367 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher6c35-account-delete-djlwq" Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.635026 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.644725 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher6c35-account-delete-djlwq"] Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.692021 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.692576 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="fc1c50f0-cd4b-417e-9cd0-c98a85493fb1" containerName="watcher-decision-engine" containerID="cri-o://e92f523176f348d2dbc52c610445d988896cfae9132e8df8ac13fb6c0498ea7f" gracePeriod=30 Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.742319 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="48177e62-e16a-4975-925f-a3471fb5580b" containerName="watcher-applier" containerID="cri-o://a9d493c45276a19adb91d13fdc1cda447b7b7a3d3d9193682c2670d6776fde68" gracePeriod=30 Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.756248 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.756561 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="c408cd39-4dba-492c-b2d8-3be77e9179e2" containerName="watcher-kuttl-api-log" 
containerID="cri-o://952381d948f0f675652924c7cb21e8e03c2da0a36101decc60458562c089419d" gracePeriod=30 Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.756716 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="c408cd39-4dba-492c-b2d8-3be77e9179e2" containerName="watcher-api" containerID="cri-o://73ffaff83cc52a5cf5f960099d42881819951c83749d7d5d117ca4c24be35d85" gracePeriod=30 Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.794618 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aba277b-223c-4aac-8420-c09f81cb1d66-operator-scripts\") pod \"watcher6c35-account-delete-djlwq\" (UID: \"1aba277b-223c-4aac-8420-c09f81cb1d66\") " pod="watcher-kuttl-default/watcher6c35-account-delete-djlwq" Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.794784 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpldb\" (UniqueName: \"kubernetes.io/projected/1aba277b-223c-4aac-8420-c09f81cb1d66-kube-api-access-rpldb\") pod \"watcher6c35-account-delete-djlwq\" (UID: \"1aba277b-223c-4aac-8420-c09f81cb1d66\") " pod="watcher-kuttl-default/watcher6c35-account-delete-djlwq" Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.830227 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.910577 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aba277b-223c-4aac-8420-c09f81cb1d66-operator-scripts\") pod \"watcher6c35-account-delete-djlwq\" (UID: \"1aba277b-223c-4aac-8420-c09f81cb1d66\") " pod="watcher-kuttl-default/watcher6c35-account-delete-djlwq" Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.910769 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpldb\" (UniqueName: \"kubernetes.io/projected/1aba277b-223c-4aac-8420-c09f81cb1d66-kube-api-access-rpldb\") pod \"watcher6c35-account-delete-djlwq\" (UID: \"1aba277b-223c-4aac-8420-c09f81cb1d66\") " pod="watcher-kuttl-default/watcher6c35-account-delete-djlwq" Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.911866 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aba277b-223c-4aac-8420-c09f81cb1d66-operator-scripts\") pod \"watcher6c35-account-delete-djlwq\" (UID: \"1aba277b-223c-4aac-8420-c09f81cb1d66\") " pod="watcher-kuttl-default/watcher6c35-account-delete-djlwq" Mar 14 09:39:05 crc kubenswrapper[4956]: I0314 09:39:05.951422 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpldb\" (UniqueName: \"kubernetes.io/projected/1aba277b-223c-4aac-8420-c09f81cb1d66-kube-api-access-rpldb\") pod \"watcher6c35-account-delete-djlwq\" (UID: \"1aba277b-223c-4aac-8420-c09f81cb1d66\") " pod="watcher-kuttl-default/watcher6c35-account-delete-djlwq" Mar 14 09:39:06 crc kubenswrapper[4956]: I0314 09:39:06.245886 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher6c35-account-delete-djlwq" Mar 14 09:39:06 crc kubenswrapper[4956]: I0314 09:39:06.739965 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher6c35-account-delete-djlwq"] Mar 14 09:39:06 crc kubenswrapper[4956]: I0314 09:39:06.761312 4956 generic.go:334] "Generic (PLEG): container finished" podID="c408cd39-4dba-492c-b2d8-3be77e9179e2" containerID="952381d948f0f675652924c7cb21e8e03c2da0a36101decc60458562c089419d" exitCode=143 Mar 14 09:39:06 crc kubenswrapper[4956]: I0314 09:39:06.761369 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"c408cd39-4dba-492c-b2d8-3be77e9179e2","Type":"ContainerDied","Data":"952381d948f0f675652924c7cb21e8e03c2da0a36101decc60458562c089419d"} Mar 14 09:39:07 crc kubenswrapper[4956]: E0314 09:39:07.218449 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9d493c45276a19adb91d13fdc1cda447b7b7a3d3d9193682c2670d6776fde68" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 09:39:07 crc kubenswrapper[4956]: E0314 09:39:07.223462 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9d493c45276a19adb91d13fdc1cda447b7b7a3d3d9193682c2670d6776fde68" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.224170 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75517d04-8feb-4d02-8b05-1f6fed48f03d" path="/var/lib/kubelet/pods/75517d04-8feb-4d02-8b05-1f6fed48f03d/volumes" Mar 14 09:39:07 crc kubenswrapper[4956]: E0314 09:39:07.231695 4956 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9d493c45276a19adb91d13fdc1cda447b7b7a3d3d9193682c2670d6776fde68" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 09:39:07 crc kubenswrapper[4956]: E0314 09:39:07.231785 4956 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="48177e62-e16a-4975-925f-a3471fb5580b" containerName="watcher-applier" Mar 14 09:39:07 crc kubenswrapper[4956]: E0314 09:39:07.321029 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc1c50f0_cd4b_417e_9cd0_c98a85493fb1.slice/crio-conmon-e92f523176f348d2dbc52c610445d988896cfae9132e8df8ac13fb6c0498ea7f.scope\": RecentStats: unable to find data in memory cache]" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.689105 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.773709 4956 generic.go:334] "Generic (PLEG): container finished" podID="1aba277b-223c-4aac-8420-c09f81cb1d66" containerID="a16980b04fdbf3bd3321ab84daf5d54fcd210c98b6326a7aa32ab71a67cfd38d" exitCode=0 Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.773817 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher6c35-account-delete-djlwq" event={"ID":"1aba277b-223c-4aac-8420-c09f81cb1d66","Type":"ContainerDied","Data":"a16980b04fdbf3bd3321ab84daf5d54fcd210c98b6326a7aa32ab71a67cfd38d"} Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.773847 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher6c35-account-delete-djlwq" event={"ID":"1aba277b-223c-4aac-8420-c09f81cb1d66","Type":"ContainerStarted","Data":"c12287a83de89134739a25a8f9d49d4f60d99ca6a509724de972b6509ff8fa89"} Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.776438 4956 generic.go:334] "Generic (PLEG): container finished" podID="c408cd39-4dba-492c-b2d8-3be77e9179e2" containerID="73ffaff83cc52a5cf5f960099d42881819951c83749d7d5d117ca4c24be35d85" exitCode=0 Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.776496 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"c408cd39-4dba-492c-b2d8-3be77e9179e2","Type":"ContainerDied","Data":"73ffaff83cc52a5cf5f960099d42881819951c83749d7d5d117ca4c24be35d85"} Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.776514 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"c408cd39-4dba-492c-b2d8-3be77e9179e2","Type":"ContainerDied","Data":"356ce07558da0f41591864ce8eb59ee15753ca5ec85819e9bc2fbe7f8541f13f"} Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.776530 4956 scope.go:117] "RemoveContainer" 
containerID="73ffaff83cc52a5cf5f960099d42881819951c83749d7d5d117ca4c24be35d85" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.776611 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.780197 4956 generic.go:334] "Generic (PLEG): container finished" podID="fc1c50f0-cd4b-417e-9cd0-c98a85493fb1" containerID="e92f523176f348d2dbc52c610445d988896cfae9132e8df8ac13fb6c0498ea7f" exitCode=0 Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.780226 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1","Type":"ContainerDied","Data":"e92f523176f348d2dbc52c610445d988896cfae9132e8df8ac13fb6c0498ea7f"} Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.833611 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.841166 4956 scope.go:117] "RemoveContainer" containerID="952381d948f0f675652924c7cb21e8e03c2da0a36101decc60458562c089419d" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.841836 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c408cd39-4dba-492c-b2d8-3be77e9179e2-logs\") pod \"c408cd39-4dba-492c-b2d8-3be77e9179e2\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.841953 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-custom-prometheus-ca\") pod \"c408cd39-4dba-492c-b2d8-3be77e9179e2\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.842026 
4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-cert-memcached-mtls\") pod \"c408cd39-4dba-492c-b2d8-3be77e9179e2\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.842073 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-config-data\") pod \"c408cd39-4dba-492c-b2d8-3be77e9179e2\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.842203 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-combined-ca-bundle\") pod \"c408cd39-4dba-492c-b2d8-3be77e9179e2\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.842275 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z4nf\" (UniqueName: \"kubernetes.io/projected/c408cd39-4dba-492c-b2d8-3be77e9179e2-kube-api-access-5z4nf\") pod \"c408cd39-4dba-492c-b2d8-3be77e9179e2\" (UID: \"c408cd39-4dba-492c-b2d8-3be77e9179e2\") " Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.842284 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c408cd39-4dba-492c-b2d8-3be77e9179e2-logs" (OuterVolumeSpecName: "logs") pod "c408cd39-4dba-492c-b2d8-3be77e9179e2" (UID: "c408cd39-4dba-492c-b2d8-3be77e9179e2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.842723 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c408cd39-4dba-492c-b2d8-3be77e9179e2-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.848385 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c408cd39-4dba-492c-b2d8-3be77e9179e2-kube-api-access-5z4nf" (OuterVolumeSpecName: "kube-api-access-5z4nf") pod "c408cd39-4dba-492c-b2d8-3be77e9179e2" (UID: "c408cd39-4dba-492c-b2d8-3be77e9179e2"). InnerVolumeSpecName "kube-api-access-5z4nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.868320 4956 scope.go:117] "RemoveContainer" containerID="73ffaff83cc52a5cf5f960099d42881819951c83749d7d5d117ca4c24be35d85" Mar 14 09:39:07 crc kubenswrapper[4956]: E0314 09:39:07.872315 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73ffaff83cc52a5cf5f960099d42881819951c83749d7d5d117ca4c24be35d85\": container with ID starting with 73ffaff83cc52a5cf5f960099d42881819951c83749d7d5d117ca4c24be35d85 not found: ID does not exist" containerID="73ffaff83cc52a5cf5f960099d42881819951c83749d7d5d117ca4c24be35d85" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.872374 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73ffaff83cc52a5cf5f960099d42881819951c83749d7d5d117ca4c24be35d85"} err="failed to get container status \"73ffaff83cc52a5cf5f960099d42881819951c83749d7d5d117ca4c24be35d85\": rpc error: code = NotFound desc = could not find container \"73ffaff83cc52a5cf5f960099d42881819951c83749d7d5d117ca4c24be35d85\": container with ID starting with 73ffaff83cc52a5cf5f960099d42881819951c83749d7d5d117ca4c24be35d85 not found: ID does not 
exist" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.872597 4956 scope.go:117] "RemoveContainer" containerID="952381d948f0f675652924c7cb21e8e03c2da0a36101decc60458562c089419d" Mar 14 09:39:07 crc kubenswrapper[4956]: E0314 09:39:07.873073 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"952381d948f0f675652924c7cb21e8e03c2da0a36101decc60458562c089419d\": container with ID starting with 952381d948f0f675652924c7cb21e8e03c2da0a36101decc60458562c089419d not found: ID does not exist" containerID="952381d948f0f675652924c7cb21e8e03c2da0a36101decc60458562c089419d" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.873133 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952381d948f0f675652924c7cb21e8e03c2da0a36101decc60458562c089419d"} err="failed to get container status \"952381d948f0f675652924c7cb21e8e03c2da0a36101decc60458562c089419d\": rpc error: code = NotFound desc = could not find container \"952381d948f0f675652924c7cb21e8e03c2da0a36101decc60458562c089419d\": container with ID starting with 952381d948f0f675652924c7cb21e8e03c2da0a36101decc60458562c089419d not found: ID does not exist" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.918525 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c408cd39-4dba-492c-b2d8-3be77e9179e2" (UID: "c408cd39-4dba-492c-b2d8-3be77e9179e2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.937557 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "c408cd39-4dba-492c-b2d8-3be77e9179e2" (UID: "c408cd39-4dba-492c-b2d8-3be77e9179e2"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.943687 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55x4k\" (UniqueName: \"kubernetes.io/projected/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-kube-api-access-55x4k\") pod \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.943731 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-cert-memcached-mtls\") pod \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.943762 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-config-data\") pod \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.943789 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-combined-ca-bundle\") pod \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 
09:39:07.943819 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-logs\") pod \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.943958 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-custom-prometheus-ca\") pod \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\" (UID: \"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1\") " Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.944453 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.944470 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.944479 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z4nf\" (UniqueName: \"kubernetes.io/projected/c408cd39-4dba-492c-b2d8-3be77e9179e2-kube-api-access-5z4nf\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.947430 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-kube-api-access-55x4k" (OuterVolumeSpecName: "kube-api-access-55x4k") pod "fc1c50f0-cd4b-417e-9cd0-c98a85493fb1" (UID: "fc1c50f0-cd4b-417e-9cd0-c98a85493fb1"). InnerVolumeSpecName "kube-api-access-55x4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.948574 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-logs" (OuterVolumeSpecName: "logs") pod "fc1c50f0-cd4b-417e-9cd0-c98a85493fb1" (UID: "fc1c50f0-cd4b-417e-9cd0-c98a85493fb1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.957049 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-config-data" (OuterVolumeSpecName: "config-data") pod "c408cd39-4dba-492c-b2d8-3be77e9179e2" (UID: "c408cd39-4dba-492c-b2d8-3be77e9179e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.962255 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "c408cd39-4dba-492c-b2d8-3be77e9179e2" (UID: "c408cd39-4dba-492c-b2d8-3be77e9179e2"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.971195 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc1c50f0-cd4b-417e-9cd0-c98a85493fb1" (UID: "fc1c50f0-cd4b-417e-9cd0-c98a85493fb1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.976682 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "fc1c50f0-cd4b-417e-9cd0-c98a85493fb1" (UID: "fc1c50f0-cd4b-417e-9cd0-c98a85493fb1"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:07 crc kubenswrapper[4956]: I0314 09:39:07.986614 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-config-data" (OuterVolumeSpecName: "config-data") pod "fc1c50f0-cd4b-417e-9cd0-c98a85493fb1" (UID: "fc1c50f0-cd4b-417e-9cd0-c98a85493fb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.030874 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "fc1c50f0-cd4b-417e-9cd0-c98a85493fb1" (UID: "fc1c50f0-cd4b-417e-9cd0-c98a85493fb1"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.045988 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.046024 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.046037 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c408cd39-4dba-492c-b2d8-3be77e9179e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.046047 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55x4k\" (UniqueName: \"kubernetes.io/projected/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-kube-api-access-55x4k\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.046058 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.046067 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.046075 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 
09:39:08.046083 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.110266 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.118262 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.376532 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.376892 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerName="ceilometer-central-agent" containerID="cri-o://ba5d7af0b5787d7566fabb9b605b6d86ec49eca5186667be141d6fbf73cb07ef" gracePeriod=30 Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.377051 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerName="sg-core" containerID="cri-o://4f4b4dbb0bf7da9822bd591423ba49625dbb430fd045c59220600187d7fd8c71" gracePeriod=30 Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.377173 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerName="ceilometer-notification-agent" containerID="cri-o://59438924ff1af4b89ec559a8cc5d7c9f17c1c34956ed7ae42084f8d64ad76fc8" gracePeriod=30 Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.377177 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" 
podUID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerName="proxy-httpd" containerID="cri-o://cfe179e8829f7fe8ba039db39685bde525628ea0274ea935d1996f9577d67087" gracePeriod=30 Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.793770 4956 generic.go:334] "Generic (PLEG): container finished" podID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerID="cfe179e8829f7fe8ba039db39685bde525628ea0274ea935d1996f9577d67087" exitCode=0 Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.793802 4956 generic.go:334] "Generic (PLEG): container finished" podID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerID="4f4b4dbb0bf7da9822bd591423ba49625dbb430fd045c59220600187d7fd8c71" exitCode=2 Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.793845 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8342e7d4-b4ee-48fa-b84b-692775aacba0","Type":"ContainerDied","Data":"cfe179e8829f7fe8ba039db39685bde525628ea0274ea935d1996f9577d67087"} Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.794375 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8342e7d4-b4ee-48fa-b84b-692775aacba0","Type":"ContainerDied","Data":"4f4b4dbb0bf7da9822bd591423ba49625dbb430fd045c59220600187d7fd8c71"} Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.799391 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.799585 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"fc1c50f0-cd4b-417e-9cd0-c98a85493fb1","Type":"ContainerDied","Data":"4fce818e2507373959d5dcc2f177ea77e721ab6c32299e9c80307f4760cf48d1"} Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.799699 4956 scope.go:117] "RemoveContainer" containerID="e92f523176f348d2dbc52c610445d988896cfae9132e8df8ac13fb6c0498ea7f" Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.842137 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:39:08 crc kubenswrapper[4956]: I0314 09:39:08.854244 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:39:09 crc kubenswrapper[4956]: I0314 09:39:09.150625 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher6c35-account-delete-djlwq" Mar 14 09:39:09 crc kubenswrapper[4956]: I0314 09:39:09.222306 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c408cd39-4dba-492c-b2d8-3be77e9179e2" path="/var/lib/kubelet/pods/c408cd39-4dba-492c-b2d8-3be77e9179e2/volumes" Mar 14 09:39:09 crc kubenswrapper[4956]: I0314 09:39:09.223160 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc1c50f0-cd4b-417e-9cd0-c98a85493fb1" path="/var/lib/kubelet/pods/fc1c50f0-cd4b-417e-9cd0-c98a85493fb1/volumes" Mar 14 09:39:09 crc kubenswrapper[4956]: I0314 09:39:09.265415 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpldb\" (UniqueName: \"kubernetes.io/projected/1aba277b-223c-4aac-8420-c09f81cb1d66-kube-api-access-rpldb\") pod \"1aba277b-223c-4aac-8420-c09f81cb1d66\" (UID: \"1aba277b-223c-4aac-8420-c09f81cb1d66\") " Mar 14 09:39:09 crc kubenswrapper[4956]: I0314 09:39:09.265626 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aba277b-223c-4aac-8420-c09f81cb1d66-operator-scripts\") pod \"1aba277b-223c-4aac-8420-c09f81cb1d66\" (UID: \"1aba277b-223c-4aac-8420-c09f81cb1d66\") " Mar 14 09:39:09 crc kubenswrapper[4956]: I0314 09:39:09.266390 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aba277b-223c-4aac-8420-c09f81cb1d66-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1aba277b-223c-4aac-8420-c09f81cb1d66" (UID: "1aba277b-223c-4aac-8420-c09f81cb1d66"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:39:09 crc kubenswrapper[4956]: I0314 09:39:09.282171 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aba277b-223c-4aac-8420-c09f81cb1d66-kube-api-access-rpldb" (OuterVolumeSpecName: "kube-api-access-rpldb") pod "1aba277b-223c-4aac-8420-c09f81cb1d66" (UID: "1aba277b-223c-4aac-8420-c09f81cb1d66"). InnerVolumeSpecName "kube-api-access-rpldb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:39:09 crc kubenswrapper[4956]: I0314 09:39:09.367728 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aba277b-223c-4aac-8420-c09f81cb1d66-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:09 crc kubenswrapper[4956]: I0314 09:39:09.367764 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpldb\" (UniqueName: \"kubernetes.io/projected/1aba277b-223c-4aac-8420-c09f81cb1d66-kube-api-access-rpldb\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:09 crc kubenswrapper[4956]: I0314 09:39:09.812772 4956 generic.go:334] "Generic (PLEG): container finished" podID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerID="59438924ff1af4b89ec559a8cc5d7c9f17c1c34956ed7ae42084f8d64ad76fc8" exitCode=0 Mar 14 09:39:09 crc kubenswrapper[4956]: I0314 09:39:09.813113 4956 generic.go:334] "Generic (PLEG): container finished" podID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerID="ba5d7af0b5787d7566fabb9b605b6d86ec49eca5186667be141d6fbf73cb07ef" exitCode=0 Mar 14 09:39:09 crc kubenswrapper[4956]: I0314 09:39:09.812979 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8342e7d4-b4ee-48fa-b84b-692775aacba0","Type":"ContainerDied","Data":"59438924ff1af4b89ec559a8cc5d7c9f17c1c34956ed7ae42084f8d64ad76fc8"} Mar 14 09:39:09 crc kubenswrapper[4956]: I0314 09:39:09.813183 4956 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8342e7d4-b4ee-48fa-b84b-692775aacba0","Type":"ContainerDied","Data":"ba5d7af0b5787d7566fabb9b605b6d86ec49eca5186667be141d6fbf73cb07ef"} Mar 14 09:39:09 crc kubenswrapper[4956]: I0314 09:39:09.817987 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher6c35-account-delete-djlwq" event={"ID":"1aba277b-223c-4aac-8420-c09f81cb1d66","Type":"ContainerDied","Data":"c12287a83de89134739a25a8f9d49d4f60d99ca6a509724de972b6509ff8fa89"} Mar 14 09:39:09 crc kubenswrapper[4956]: I0314 09:39:09.818046 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c12287a83de89134739a25a8f9d49d4f60d99ca6a509724de972b6509ff8fa89" Mar 14 09:39:09 crc kubenswrapper[4956]: I0314 09:39:09.818116 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher6c35-account-delete-djlwq" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.165720 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.286447 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-combined-ca-bundle\") pod \"8342e7d4-b4ee-48fa-b84b-692775aacba0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.286629 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-scripts\") pod \"8342e7d4-b4ee-48fa-b84b-692775aacba0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.286655 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-ceilometer-tls-certs\") pod \"8342e7d4-b4ee-48fa-b84b-692775aacba0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.286712 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8342e7d4-b4ee-48fa-b84b-692775aacba0-log-httpd\") pod \"8342e7d4-b4ee-48fa-b84b-692775aacba0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.286785 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-sg-core-conf-yaml\") pod \"8342e7d4-b4ee-48fa-b84b-692775aacba0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.286811 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-config-data\") pod \"8342e7d4-b4ee-48fa-b84b-692775aacba0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.286831 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8wfj\" (UniqueName: \"kubernetes.io/projected/8342e7d4-b4ee-48fa-b84b-692775aacba0-kube-api-access-p8wfj\") pod \"8342e7d4-b4ee-48fa-b84b-692775aacba0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.286887 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8342e7d4-b4ee-48fa-b84b-692775aacba0-run-httpd\") pod \"8342e7d4-b4ee-48fa-b84b-692775aacba0\" (UID: \"8342e7d4-b4ee-48fa-b84b-692775aacba0\") " Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.288529 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8342e7d4-b4ee-48fa-b84b-692775aacba0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8342e7d4-b4ee-48fa-b84b-692775aacba0" (UID: "8342e7d4-b4ee-48fa-b84b-692775aacba0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.288620 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8342e7d4-b4ee-48fa-b84b-692775aacba0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8342e7d4-b4ee-48fa-b84b-692775aacba0" (UID: "8342e7d4-b4ee-48fa-b84b-692775aacba0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.295378 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8342e7d4-b4ee-48fa-b84b-692775aacba0-kube-api-access-p8wfj" (OuterVolumeSpecName: "kube-api-access-p8wfj") pod "8342e7d4-b4ee-48fa-b84b-692775aacba0" (UID: "8342e7d4-b4ee-48fa-b84b-692775aacba0"). InnerVolumeSpecName "kube-api-access-p8wfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.308650 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-scripts" (OuterVolumeSpecName: "scripts") pod "8342e7d4-b4ee-48fa-b84b-692775aacba0" (UID: "8342e7d4-b4ee-48fa-b84b-692775aacba0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.365623 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8342e7d4-b4ee-48fa-b84b-692775aacba0" (UID: "8342e7d4-b4ee-48fa-b84b-692775aacba0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.389833 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8342e7d4-b4ee-48fa-b84b-692775aacba0" (UID: "8342e7d4-b4ee-48fa-b84b-692775aacba0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.390950 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.391762 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.391849 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8342e7d4-b4ee-48fa-b84b-692775aacba0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.391932 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.391994 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8wfj\" (UniqueName: \"kubernetes.io/projected/8342e7d4-b4ee-48fa-b84b-692775aacba0-kube-api-access-p8wfj\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.392051 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8342e7d4-b4ee-48fa-b84b-692775aacba0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.397567 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8342e7d4-b4ee-48fa-b84b-692775aacba0" (UID: 
"8342e7d4-b4ee-48fa-b84b-692775aacba0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.437774 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-config-data" (OuterVolumeSpecName: "config-data") pod "8342e7d4-b4ee-48fa-b84b-692775aacba0" (UID: "8342e7d4-b4ee-48fa-b84b-692775aacba0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.493570 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.493605 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8342e7d4-b4ee-48fa-b84b-692775aacba0-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.691577 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-s62jg"] Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.704885 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-s62jg"] Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.721868 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher6c35-account-delete-djlwq"] Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.728247 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r"] Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.740193 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher6c35-account-delete-djlwq"] Mar 14 
09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.747530 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-6c35-account-create-update-tbl6r"] Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.837177 4956 generic.go:334] "Generic (PLEG): container finished" podID="48177e62-e16a-4975-925f-a3471fb5580b" containerID="a9d493c45276a19adb91d13fdc1cda447b7b7a3d3d9193682c2670d6776fde68" exitCode=0 Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.837264 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"48177e62-e16a-4975-925f-a3471fb5580b","Type":"ContainerDied","Data":"a9d493c45276a19adb91d13fdc1cda447b7b7a3d3d9193682c2670d6776fde68"} Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.840220 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8342e7d4-b4ee-48fa-b84b-692775aacba0","Type":"ContainerDied","Data":"a81c9cbec1f3dd426f553b13704e6bc9bdd2012486876b840c3a305e948f9a2c"} Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.840260 4956 scope.go:117] "RemoveContainer" containerID="cfe179e8829f7fe8ba039db39685bde525628ea0274ea935d1996f9577d67087" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.840376 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.874691 4956 scope.go:117] "RemoveContainer" containerID="4f4b4dbb0bf7da9822bd591423ba49625dbb430fd045c59220600187d7fd8c71" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.876817 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.894139 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.905836 4956 scope.go:117] "RemoveContainer" containerID="59438924ff1af4b89ec559a8cc5d7c9f17c1c34956ed7ae42084f8d64ad76fc8" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.921797 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:39:10 crc kubenswrapper[4956]: E0314 09:39:10.922136 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc1c50f0-cd4b-417e-9cd0-c98a85493fb1" containerName="watcher-decision-engine" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.922151 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc1c50f0-cd4b-417e-9cd0-c98a85493fb1" containerName="watcher-decision-engine" Mar 14 09:39:10 crc kubenswrapper[4956]: E0314 09:39:10.922169 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c408cd39-4dba-492c-b2d8-3be77e9179e2" containerName="watcher-kuttl-api-log" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.922177 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c408cd39-4dba-492c-b2d8-3be77e9179e2" containerName="watcher-kuttl-api-log" Mar 14 09:39:10 crc kubenswrapper[4956]: E0314 09:39:10.922189 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerName="sg-core" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.922195 4956 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerName="sg-core" Mar 14 09:39:10 crc kubenswrapper[4956]: E0314 09:39:10.922209 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c408cd39-4dba-492c-b2d8-3be77e9179e2" containerName="watcher-api" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.922215 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c408cd39-4dba-492c-b2d8-3be77e9179e2" containerName="watcher-api" Mar 14 09:39:10 crc kubenswrapper[4956]: E0314 09:39:10.922224 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerName="ceilometer-central-agent" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.922230 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerName="ceilometer-central-agent" Mar 14 09:39:10 crc kubenswrapper[4956]: E0314 09:39:10.922240 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerName="ceilometer-notification-agent" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.922247 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerName="ceilometer-notification-agent" Mar 14 09:39:10 crc kubenswrapper[4956]: E0314 09:39:10.922255 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerName="proxy-httpd" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.922263 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerName="proxy-httpd" Mar 14 09:39:10 crc kubenswrapper[4956]: E0314 09:39:10.922271 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aba277b-223c-4aac-8420-c09f81cb1d66" containerName="mariadb-account-delete" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 
09:39:10.922277 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aba277b-223c-4aac-8420-c09f81cb1d66" containerName="mariadb-account-delete" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.922431 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc1c50f0-cd4b-417e-9cd0-c98a85493fb1" containerName="watcher-decision-engine" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.922444 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerName="ceilometer-central-agent" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.922454 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerName="ceilometer-notification-agent" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.922463 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerName="sg-core" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.922471 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="c408cd39-4dba-492c-b2d8-3be77e9179e2" containerName="watcher-api" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.922497 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="c408cd39-4dba-492c-b2d8-3be77e9179e2" containerName="watcher-kuttl-api-log" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.922505 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8342e7d4-b4ee-48fa-b84b-692775aacba0" containerName="proxy-httpd" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.922515 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aba277b-223c-4aac-8420-c09f81cb1d66" containerName="mariadb-account-delete" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.929048 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.932685 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.934877 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.935685 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.939401 4956 scope.go:117] "RemoveContainer" containerID="ba5d7af0b5787d7566fabb9b605b6d86ec49eca5186667be141d6fbf73cb07ef" Mar 14 09:39:10 crc kubenswrapper[4956]: I0314 09:39:10.956114 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.062027 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.113179 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.113268 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.113336 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-scripts\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.113381 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-config-data\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.113405 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmksh\" (UniqueName: \"kubernetes.io/projected/dbdeee31-d317-47d5-a88b-81086f0da9ad-kube-api-access-cmksh\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " 
pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.113436 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbdeee31-d317-47d5-a88b-81086f0da9ad-log-httpd\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.113468 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.113521 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbdeee31-d317-47d5-a88b-81086f0da9ad-run-httpd\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.214317 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-config-data\") pod \"48177e62-e16a-4975-925f-a3471fb5580b\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.216262 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48177e62-e16a-4975-925f-a3471fb5580b-logs\") pod \"48177e62-e16a-4975-925f-a3471fb5580b\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.216301 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-combined-ca-bundle\") pod \"48177e62-e16a-4975-925f-a3471fb5580b\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.216412 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-cert-memcached-mtls\") pod \"48177e62-e16a-4975-925f-a3471fb5580b\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.216639 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ng94\" (UniqueName: \"kubernetes.io/projected/48177e62-e16a-4975-925f-a3471fb5580b-kube-api-access-2ng94\") pod \"48177e62-e16a-4975-925f-a3471fb5580b\" (UID: \"48177e62-e16a-4975-925f-a3471fb5580b\") " Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.217086 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-scripts\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.217171 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48177e62-e16a-4975-925f-a3471fb5580b-logs" (OuterVolumeSpecName: "logs") pod "48177e62-e16a-4975-925f-a3471fb5580b" (UID: "48177e62-e16a-4975-925f-a3471fb5580b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.218849 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-config-data\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.218897 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmksh\" (UniqueName: \"kubernetes.io/projected/dbdeee31-d317-47d5-a88b-81086f0da9ad-kube-api-access-cmksh\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.218956 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbdeee31-d317-47d5-a88b-81086f0da9ad-log-httpd\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.219029 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.219105 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbdeee31-d317-47d5-a88b-81086f0da9ad-run-httpd\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.219171 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.219285 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.219615 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48177e62-e16a-4975-925f-a3471fb5580b-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.219815 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbdeee31-d317-47d5-a88b-81086f0da9ad-log-httpd\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.221084 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48177e62-e16a-4975-925f-a3471fb5580b-kube-api-access-2ng94" (OuterVolumeSpecName: "kube-api-access-2ng94") pod "48177e62-e16a-4975-925f-a3471fb5580b" (UID: "48177e62-e16a-4975-925f-a3471fb5580b"). InnerVolumeSpecName "kube-api-access-2ng94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.221343 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-scripts\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.222499 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbdeee31-d317-47d5-a88b-81086f0da9ad-run-httpd\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.222749 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aba277b-223c-4aac-8420-c09f81cb1d66" path="/var/lib/kubelet/pods/1aba277b-223c-4aac-8420-c09f81cb1d66/volumes" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.223429 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8342e7d4-b4ee-48fa-b84b-692775aacba0" path="/var/lib/kubelet/pods/8342e7d4-b4ee-48fa-b84b-692775aacba0/volumes" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.224235 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96054dc2-caf1-4272-9615-05b7611d9644" path="/var/lib/kubelet/pods/96054dc2-caf1-4272-9615-05b7611d9644/volumes" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.225280 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-config-data\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.224339 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.224519 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.224760 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.225941 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5" path="/var/lib/kubelet/pods/c28fba1d-4efd-4cf3-8f90-6a2df7ebbef5/volumes" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.236263 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmksh\" (UniqueName: \"kubernetes.io/projected/dbdeee31-d317-47d5-a88b-81086f0da9ad-kube-api-access-cmksh\") pod \"ceilometer-0\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.242770 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48177e62-e16a-4975-925f-a3471fb5580b" (UID: "48177e62-e16a-4975-925f-a3471fb5580b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.253253 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.267986 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-config-data" (OuterVolumeSpecName: "config-data") pod "48177e62-e16a-4975-925f-a3471fb5580b" (UID: "48177e62-e16a-4975-925f-a3471fb5580b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.292006 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "48177e62-e16a-4975-925f-a3471fb5580b" (UID: "48177e62-e16a-4975-925f-a3471fb5580b"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.321385 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.321420 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.321440 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/48177e62-e16a-4975-925f-a3471fb5580b-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.321452 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ng94\" (UniqueName: \"kubernetes.io/projected/48177e62-e16a-4975-925f-a3471fb5580b-kube-api-access-2ng94\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.721114 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.851908 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.851905 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"48177e62-e16a-4975-925f-a3471fb5580b","Type":"ContainerDied","Data":"bdc5fb76834f7f31a00a1c40c4c18e33db1f99081290ca3d316c1c068c5957da"} Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.851997 4956 scope.go:117] "RemoveContainer" containerID="a9d493c45276a19adb91d13fdc1cda447b7b7a3d3d9193682c2670d6776fde68" Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.854178 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dbdeee31-d317-47d5-a88b-81086f0da9ad","Type":"ContainerStarted","Data":"ea69fbfef0ace36d0f422278f3b058f13144a5a99eb1b7203943e797ae016647"} Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.883855 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:39:11 crc kubenswrapper[4956]: I0314 09:39:11.890544 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:39:12 crc kubenswrapper[4956]: I0314 09:39:12.868647 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dbdeee31-d317-47d5-a88b-81086f0da9ad","Type":"ContainerStarted","Data":"e6b2e2209e9c1a6fe276f6d43a9a80b3bee36c96566c2444066163e24530d6e0"} Mar 14 09:39:13 crc kubenswrapper[4956]: I0314 09:39:13.220872 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48177e62-e16a-4975-925f-a3471fb5580b" path="/var/lib/kubelet/pods/48177e62-e16a-4975-925f-a3471fb5580b/volumes" Mar 14 09:39:13 crc kubenswrapper[4956]: I0314 09:39:13.879305 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"dbdeee31-d317-47d5-a88b-81086f0da9ad","Type":"ContainerStarted","Data":"aa2207590be89a4a9ca8f9956c2c777517096d2f9d575b5b34c2a0e8aa762aba"} Mar 14 09:39:13 crc kubenswrapper[4956]: I0314 09:39:13.879378 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dbdeee31-d317-47d5-a88b-81086f0da9ad","Type":"ContainerStarted","Data":"b708f3657f379b18652a41bcb08f3b90c238584016f6df9fbb265996b398aa36"} Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.468651 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-mp7mc"] Mar 14 09:39:15 crc kubenswrapper[4956]: E0314 09:39:15.469387 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48177e62-e16a-4975-925f-a3471fb5580b" containerName="watcher-applier" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.469401 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="48177e62-e16a-4975-925f-a3471fb5580b" containerName="watcher-applier" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.469567 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="48177e62-e16a-4975-925f-a3471fb5580b" containerName="watcher-applier" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.470177 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-mp7mc" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.475895 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-06fc-account-create-update-kvddn"] Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.477211 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-06fc-account-create-update-kvddn" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.479107 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.485582 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-mp7mc"] Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.495306 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-06fc-account-create-update-kvddn"] Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.594227 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674a1847-6823-446f-9784-88f3bb7055b9-operator-scripts\") pod \"watcher-db-create-mp7mc\" (UID: \"674a1847-6823-446f-9784-88f3bb7055b9\") " pod="watcher-kuttl-default/watcher-db-create-mp7mc" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.594518 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhtrz\" (UniqueName: \"kubernetes.io/projected/65fb877a-5cef-4754-be35-f7bd08a21b07-kube-api-access-hhtrz\") pod \"watcher-06fc-account-create-update-kvddn\" (UID: \"65fb877a-5cef-4754-be35-f7bd08a21b07\") " pod="watcher-kuttl-default/watcher-06fc-account-create-update-kvddn" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.594558 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65fb877a-5cef-4754-be35-f7bd08a21b07-operator-scripts\") pod \"watcher-06fc-account-create-update-kvddn\" (UID: \"65fb877a-5cef-4754-be35-f7bd08a21b07\") " pod="watcher-kuttl-default/watcher-06fc-account-create-update-kvddn" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 
09:39:15.594594 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khvvd\" (UniqueName: \"kubernetes.io/projected/674a1847-6823-446f-9784-88f3bb7055b9-kube-api-access-khvvd\") pod \"watcher-db-create-mp7mc\" (UID: \"674a1847-6823-446f-9784-88f3bb7055b9\") " pod="watcher-kuttl-default/watcher-db-create-mp7mc" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.696509 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674a1847-6823-446f-9784-88f3bb7055b9-operator-scripts\") pod \"watcher-db-create-mp7mc\" (UID: \"674a1847-6823-446f-9784-88f3bb7055b9\") " pod="watcher-kuttl-default/watcher-db-create-mp7mc" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.696577 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhtrz\" (UniqueName: \"kubernetes.io/projected/65fb877a-5cef-4754-be35-f7bd08a21b07-kube-api-access-hhtrz\") pod \"watcher-06fc-account-create-update-kvddn\" (UID: \"65fb877a-5cef-4754-be35-f7bd08a21b07\") " pod="watcher-kuttl-default/watcher-06fc-account-create-update-kvddn" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.696629 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65fb877a-5cef-4754-be35-f7bd08a21b07-operator-scripts\") pod \"watcher-06fc-account-create-update-kvddn\" (UID: \"65fb877a-5cef-4754-be35-f7bd08a21b07\") " pod="watcher-kuttl-default/watcher-06fc-account-create-update-kvddn" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.696682 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khvvd\" (UniqueName: \"kubernetes.io/projected/674a1847-6823-446f-9784-88f3bb7055b9-kube-api-access-khvvd\") pod \"watcher-db-create-mp7mc\" (UID: \"674a1847-6823-446f-9784-88f3bb7055b9\") " 
pod="watcher-kuttl-default/watcher-db-create-mp7mc" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.697444 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674a1847-6823-446f-9784-88f3bb7055b9-operator-scripts\") pod \"watcher-db-create-mp7mc\" (UID: \"674a1847-6823-446f-9784-88f3bb7055b9\") " pod="watcher-kuttl-default/watcher-db-create-mp7mc" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.697651 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65fb877a-5cef-4754-be35-f7bd08a21b07-operator-scripts\") pod \"watcher-06fc-account-create-update-kvddn\" (UID: \"65fb877a-5cef-4754-be35-f7bd08a21b07\") " pod="watcher-kuttl-default/watcher-06fc-account-create-update-kvddn" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.716436 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhtrz\" (UniqueName: \"kubernetes.io/projected/65fb877a-5cef-4754-be35-f7bd08a21b07-kube-api-access-hhtrz\") pod \"watcher-06fc-account-create-update-kvddn\" (UID: \"65fb877a-5cef-4754-be35-f7bd08a21b07\") " pod="watcher-kuttl-default/watcher-06fc-account-create-update-kvddn" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.718284 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khvvd\" (UniqueName: \"kubernetes.io/projected/674a1847-6823-446f-9784-88f3bb7055b9-kube-api-access-khvvd\") pod \"watcher-db-create-mp7mc\" (UID: \"674a1847-6823-446f-9784-88f3bb7055b9\") " pod="watcher-kuttl-default/watcher-db-create-mp7mc" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.787733 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-mp7mc" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.798142 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-06fc-account-create-update-kvddn" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.918811 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dbdeee31-d317-47d5-a88b-81086f0da9ad","Type":"ContainerStarted","Data":"790ae61159a0102b51a406c5f83e4beefcd8cf7f418b1bf6d03aa5cbdee04089"} Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.919879 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:15 crc kubenswrapper[4956]: I0314 09:39:15.955343 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.579442895 podStartE2EDuration="5.955323832s" podCreationTimestamp="2026-03-14 09:39:10 +0000 UTC" firstStartedPulling="2026-03-14 09:39:11.731174405 +0000 UTC m=+2557.243866673" lastFinishedPulling="2026-03-14 09:39:15.107055342 +0000 UTC m=+2560.619747610" observedRunningTime="2026-03-14 09:39:15.943240967 +0000 UTC m=+2561.455933235" watchObservedRunningTime="2026-03-14 09:39:15.955323832 +0000 UTC m=+2561.468016100" Mar 14 09:39:16 crc kubenswrapper[4956]: I0314 09:39:16.209607 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:39:16 crc kubenswrapper[4956]: E0314 09:39:16.209925 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:39:16 crc kubenswrapper[4956]: I0314 09:39:16.351232 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["watcher-kuttl-default/watcher-db-create-mp7mc"] Mar 14 09:39:16 crc kubenswrapper[4956]: I0314 09:39:16.463885 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-06fc-account-create-update-kvddn"] Mar 14 09:39:16 crc kubenswrapper[4956]: I0314 09:39:16.929002 4956 generic.go:334] "Generic (PLEG): container finished" podID="65fb877a-5cef-4754-be35-f7bd08a21b07" containerID="8018ec2c9894974394dd592ff406206f4e5a3f8fa4d1c765f05346af4117cd40" exitCode=0 Mar 14 09:39:16 crc kubenswrapper[4956]: I0314 09:39:16.929099 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-06fc-account-create-update-kvddn" event={"ID":"65fb877a-5cef-4754-be35-f7bd08a21b07","Type":"ContainerDied","Data":"8018ec2c9894974394dd592ff406206f4e5a3f8fa4d1c765f05346af4117cd40"} Mar 14 09:39:16 crc kubenswrapper[4956]: I0314 09:39:16.929416 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-06fc-account-create-update-kvddn" event={"ID":"65fb877a-5cef-4754-be35-f7bd08a21b07","Type":"ContainerStarted","Data":"1a7627aa7c851ed253b7c14ed42e7e82a7a6db7fcbff7b59f93d2bce3f6ad220"} Mar 14 09:39:16 crc kubenswrapper[4956]: I0314 09:39:16.931162 4956 generic.go:334] "Generic (PLEG): container finished" podID="674a1847-6823-446f-9784-88f3bb7055b9" containerID="06c669332687ddd7bdfba4b48df4ca21c46d7b7a73dd332109daf5ff47a1c668" exitCode=0 Mar 14 09:39:16 crc kubenswrapper[4956]: I0314 09:39:16.931229 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-mp7mc" event={"ID":"674a1847-6823-446f-9784-88f3bb7055b9","Type":"ContainerDied","Data":"06c669332687ddd7bdfba4b48df4ca21c46d7b7a73dd332109daf5ff47a1c668"} Mar 14 09:39:16 crc kubenswrapper[4956]: I0314 09:39:16.931290 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-mp7mc" 
event={"ID":"674a1847-6823-446f-9784-88f3bb7055b9","Type":"ContainerStarted","Data":"e4ecb902f18801b1be2c1c20ade3d26d8b47ca4ef43ed4345b88e8c320664eda"} Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.378361 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-mp7mc" Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.383208 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-06fc-account-create-update-kvddn" Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.547105 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674a1847-6823-446f-9784-88f3bb7055b9-operator-scripts\") pod \"674a1847-6823-446f-9784-88f3bb7055b9\" (UID: \"674a1847-6823-446f-9784-88f3bb7055b9\") " Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.547267 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhtrz\" (UniqueName: \"kubernetes.io/projected/65fb877a-5cef-4754-be35-f7bd08a21b07-kube-api-access-hhtrz\") pod \"65fb877a-5cef-4754-be35-f7bd08a21b07\" (UID: \"65fb877a-5cef-4754-be35-f7bd08a21b07\") " Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.547297 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65fb877a-5cef-4754-be35-f7bd08a21b07-operator-scripts\") pod \"65fb877a-5cef-4754-be35-f7bd08a21b07\" (UID: \"65fb877a-5cef-4754-be35-f7bd08a21b07\") " Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.547350 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khvvd\" (UniqueName: \"kubernetes.io/projected/674a1847-6823-446f-9784-88f3bb7055b9-kube-api-access-khvvd\") pod \"674a1847-6823-446f-9784-88f3bb7055b9\" (UID: 
\"674a1847-6823-446f-9784-88f3bb7055b9\") " Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.548231 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/674a1847-6823-446f-9784-88f3bb7055b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "674a1847-6823-446f-9784-88f3bb7055b9" (UID: "674a1847-6823-446f-9784-88f3bb7055b9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.548403 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65fb877a-5cef-4754-be35-f7bd08a21b07-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65fb877a-5cef-4754-be35-f7bd08a21b07" (UID: "65fb877a-5cef-4754-be35-f7bd08a21b07"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.552812 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/674a1847-6823-446f-9784-88f3bb7055b9-kube-api-access-khvvd" (OuterVolumeSpecName: "kube-api-access-khvvd") pod "674a1847-6823-446f-9784-88f3bb7055b9" (UID: "674a1847-6823-446f-9784-88f3bb7055b9"). InnerVolumeSpecName "kube-api-access-khvvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.560587 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65fb877a-5cef-4754-be35-f7bd08a21b07-kube-api-access-hhtrz" (OuterVolumeSpecName: "kube-api-access-hhtrz") pod "65fb877a-5cef-4754-be35-f7bd08a21b07" (UID: "65fb877a-5cef-4754-be35-f7bd08a21b07"). InnerVolumeSpecName "kube-api-access-hhtrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.648927 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674a1847-6823-446f-9784-88f3bb7055b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.648959 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhtrz\" (UniqueName: \"kubernetes.io/projected/65fb877a-5cef-4754-be35-f7bd08a21b07-kube-api-access-hhtrz\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.648971 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65fb877a-5cef-4754-be35-f7bd08a21b07-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.648981 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khvvd\" (UniqueName: \"kubernetes.io/projected/674a1847-6823-446f-9784-88f3bb7055b9-kube-api-access-khvvd\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.948208 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-mp7mc" event={"ID":"674a1847-6823-446f-9784-88f3bb7055b9","Type":"ContainerDied","Data":"e4ecb902f18801b1be2c1c20ade3d26d8b47ca4ef43ed4345b88e8c320664eda"} Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.948261 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4ecb902f18801b1be2c1c20ade3d26d8b47ca4ef43ed4345b88e8c320664eda" Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.948319 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-mp7mc" Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.949768 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-06fc-account-create-update-kvddn" event={"ID":"65fb877a-5cef-4754-be35-f7bd08a21b07","Type":"ContainerDied","Data":"1a7627aa7c851ed253b7c14ed42e7e82a7a6db7fcbff7b59f93d2bce3f6ad220"} Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.949795 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a7627aa7c851ed253b7c14ed42e7e82a7a6db7fcbff7b59f93d2bce3f6ad220" Mar 14 09:39:18 crc kubenswrapper[4956]: I0314 09:39:18.949852 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-06fc-account-create-update-kvddn" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.311912 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-x286z"] Mar 14 09:39:21 crc kubenswrapper[4956]: E0314 09:39:21.312671 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="674a1847-6823-446f-9784-88f3bb7055b9" containerName="mariadb-database-create" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.312692 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="674a1847-6823-446f-9784-88f3bb7055b9" containerName="mariadb-database-create" Mar 14 09:39:21 crc kubenswrapper[4956]: E0314 09:39:21.312707 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65fb877a-5cef-4754-be35-f7bd08a21b07" containerName="mariadb-account-create-update" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.312714 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="65fb877a-5cef-4754-be35-f7bd08a21b07" containerName="mariadb-account-create-update" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.312919 4956 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="674a1847-6823-446f-9784-88f3bb7055b9" containerName="mariadb-database-create" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.312943 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="65fb877a-5cef-4754-be35-f7bd08a21b07" containerName="mariadb-account-create-update" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.313639 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.315740 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.316543 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-9mnr9" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.324454 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-x286z"] Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.400415 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-config-data\") pod \"watcher-kuttl-db-sync-x286z\" (UID: \"2ce9fb31-8246-44a1-9bd5-982abf68595d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.400530 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-x286z\" (UID: \"2ce9fb31-8246-44a1-9bd5-982abf68595d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.400949 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-db-sync-config-data\") pod \"watcher-kuttl-db-sync-x286z\" (UID: \"2ce9fb31-8246-44a1-9bd5-982abf68595d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.401011 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldbmp\" (UniqueName: \"kubernetes.io/projected/2ce9fb31-8246-44a1-9bd5-982abf68595d-kube-api-access-ldbmp\") pod \"watcher-kuttl-db-sync-x286z\" (UID: \"2ce9fb31-8246-44a1-9bd5-982abf68595d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.503086 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldbmp\" (UniqueName: \"kubernetes.io/projected/2ce9fb31-8246-44a1-9bd5-982abf68595d-kube-api-access-ldbmp\") pod \"watcher-kuttl-db-sync-x286z\" (UID: \"2ce9fb31-8246-44a1-9bd5-982abf68595d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.503521 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-config-data\") pod \"watcher-kuttl-db-sync-x286z\" (UID: \"2ce9fb31-8246-44a1-9bd5-982abf68595d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.503605 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-x286z\" (UID: \"2ce9fb31-8246-44a1-9bd5-982abf68595d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" Mar 14 09:39:21 crc 
kubenswrapper[4956]: I0314 09:39:21.505643 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-db-sync-config-data\") pod \"watcher-kuttl-db-sync-x286z\" (UID: \"2ce9fb31-8246-44a1-9bd5-982abf68595d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.520354 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-db-sync-config-data\") pod \"watcher-kuttl-db-sync-x286z\" (UID: \"2ce9fb31-8246-44a1-9bd5-982abf68595d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.520446 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-x286z\" (UID: \"2ce9fb31-8246-44a1-9bd5-982abf68595d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.520950 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-config-data\") pod \"watcher-kuttl-db-sync-x286z\" (UID: \"2ce9fb31-8246-44a1-9bd5-982abf68595d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" Mar 14 09:39:21 crc kubenswrapper[4956]: I0314 09:39:21.528103 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldbmp\" (UniqueName: \"kubernetes.io/projected/2ce9fb31-8246-44a1-9bd5-982abf68595d-kube-api-access-ldbmp\") pod \"watcher-kuttl-db-sync-x286z\" (UID: \"2ce9fb31-8246-44a1-9bd5-982abf68595d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" Mar 14 09:39:21 crc 
kubenswrapper[4956]: I0314 09:39:21.631763 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" Mar 14 09:39:22 crc kubenswrapper[4956]: I0314 09:39:22.095927 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-x286z"] Mar 14 09:39:22 crc kubenswrapper[4956]: I0314 09:39:22.983941 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" event={"ID":"2ce9fb31-8246-44a1-9bd5-982abf68595d","Type":"ContainerStarted","Data":"6f260e8f33b2163b744feb503a20b7d1d3d7dbf980ad0e5ed426cf64bc4dbac8"} Mar 14 09:39:22 crc kubenswrapper[4956]: I0314 09:39:22.984267 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" event={"ID":"2ce9fb31-8246-44a1-9bd5-982abf68595d","Type":"ContainerStarted","Data":"a91581f51310dc90a7bcb1759dded2b455f9548db5ab4fe23fee130c2591e576"} Mar 14 09:39:23 crc kubenswrapper[4956]: I0314 09:39:23.002379 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" podStartSLOduration=2.002361859 podStartE2EDuration="2.002361859s" podCreationTimestamp="2026-03-14 09:39:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:39:22.99885326 +0000 UTC m=+2568.511545528" watchObservedRunningTime="2026-03-14 09:39:23.002361859 +0000 UTC m=+2568.515054127" Mar 14 09:39:25 crc kubenswrapper[4956]: I0314 09:39:25.000218 4956 generic.go:334] "Generic (PLEG): container finished" podID="2ce9fb31-8246-44a1-9bd5-982abf68595d" containerID="6f260e8f33b2163b744feb503a20b7d1d3d7dbf980ad0e5ed426cf64bc4dbac8" exitCode=0 Mar 14 09:39:25 crc kubenswrapper[4956]: I0314 09:39:25.000335 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" event={"ID":"2ce9fb31-8246-44a1-9bd5-982abf68595d","Type":"ContainerDied","Data":"6f260e8f33b2163b744feb503a20b7d1d3d7dbf980ad0e5ed426cf64bc4dbac8"} Mar 14 09:39:26 crc kubenswrapper[4956]: I0314 09:39:26.366933 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" Mar 14 09:39:26 crc kubenswrapper[4956]: I0314 09:39:26.485361 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-db-sync-config-data\") pod \"2ce9fb31-8246-44a1-9bd5-982abf68595d\" (UID: \"2ce9fb31-8246-44a1-9bd5-982abf68595d\") " Mar 14 09:39:26 crc kubenswrapper[4956]: I0314 09:39:26.485416 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-config-data\") pod \"2ce9fb31-8246-44a1-9bd5-982abf68595d\" (UID: \"2ce9fb31-8246-44a1-9bd5-982abf68595d\") " Mar 14 09:39:26 crc kubenswrapper[4956]: I0314 09:39:26.485530 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldbmp\" (UniqueName: \"kubernetes.io/projected/2ce9fb31-8246-44a1-9bd5-982abf68595d-kube-api-access-ldbmp\") pod \"2ce9fb31-8246-44a1-9bd5-982abf68595d\" (UID: \"2ce9fb31-8246-44a1-9bd5-982abf68595d\") " Mar 14 09:39:26 crc kubenswrapper[4956]: I0314 09:39:26.485620 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-combined-ca-bundle\") pod \"2ce9fb31-8246-44a1-9bd5-982abf68595d\" (UID: \"2ce9fb31-8246-44a1-9bd5-982abf68595d\") " Mar 14 09:39:26 crc kubenswrapper[4956]: I0314 09:39:26.491874 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2ce9fb31-8246-44a1-9bd5-982abf68595d" (UID: "2ce9fb31-8246-44a1-9bd5-982abf68595d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:26 crc kubenswrapper[4956]: I0314 09:39:26.494831 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ce9fb31-8246-44a1-9bd5-982abf68595d-kube-api-access-ldbmp" (OuterVolumeSpecName: "kube-api-access-ldbmp") pod "2ce9fb31-8246-44a1-9bd5-982abf68595d" (UID: "2ce9fb31-8246-44a1-9bd5-982abf68595d"). InnerVolumeSpecName "kube-api-access-ldbmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:39:26 crc kubenswrapper[4956]: I0314 09:39:26.518826 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ce9fb31-8246-44a1-9bd5-982abf68595d" (UID: "2ce9fb31-8246-44a1-9bd5-982abf68595d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:26 crc kubenswrapper[4956]: I0314 09:39:26.559391 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-config-data" (OuterVolumeSpecName: "config-data") pod "2ce9fb31-8246-44a1-9bd5-982abf68595d" (UID: "2ce9fb31-8246-44a1-9bd5-982abf68595d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:26 crc kubenswrapper[4956]: I0314 09:39:26.588075 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:26 crc kubenswrapper[4956]: I0314 09:39:26.588151 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldbmp\" (UniqueName: \"kubernetes.io/projected/2ce9fb31-8246-44a1-9bd5-982abf68595d-kube-api-access-ldbmp\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:26 crc kubenswrapper[4956]: I0314 09:39:26.588165 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:26 crc kubenswrapper[4956]: I0314 09:39:26.588175 4956 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2ce9fb31-8246-44a1-9bd5-982abf68595d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.024982 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" event={"ID":"2ce9fb31-8246-44a1-9bd5-982abf68595d","Type":"ContainerDied","Data":"a91581f51310dc90a7bcb1759dded2b455f9548db5ab4fe23fee130c2591e576"} Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.025039 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a91581f51310dc90a7bcb1759dded2b455f9548db5ab4fe23fee130c2591e576" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.025073 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-x286z" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.332686 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:39:27 crc kubenswrapper[4956]: E0314 09:39:27.333150 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce9fb31-8246-44a1-9bd5-982abf68595d" containerName="watcher-kuttl-db-sync" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.333173 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce9fb31-8246-44a1-9bd5-982abf68595d" containerName="watcher-kuttl-db-sync" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.333361 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ce9fb31-8246-44a1-9bd5-982abf68595d" containerName="watcher-kuttl-db-sync" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.334419 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.336625 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.338026 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-9mnr9" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.347152 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.361854 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.363176 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.367217 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.397898 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.403316 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.403363 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aee30952-18d5-4640-9f7b-6e219d65f30c-logs\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.403404 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.403446 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.403513 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qkl4\" (UniqueName: \"kubernetes.io/projected/aee30952-18d5-4640-9f7b-6e219d65f30c-kube-api-access-7qkl4\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.403547 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.456165 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.457342 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.460020 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.472159 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.504839 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.504908 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.504935 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.504965 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aee30952-18d5-4640-9f7b-6e219d65f30c-logs\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.505017 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.505064 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.505088 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn5hl\" (UniqueName: \"kubernetes.io/projected/ff629ac3-608b-43fa-89b1-ed416b1392ca-kube-api-access-nn5hl\") pod \"watcher-kuttl-applier-0\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.505107 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.505138 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qkl4\" (UniqueName: \"kubernetes.io/projected/aee30952-18d5-4640-9f7b-6e219d65f30c-kube-api-access-7qkl4\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.505164 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff629ac3-608b-43fa-89b1-ed416b1392ca-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.505187 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.506087 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aee30952-18d5-4640-9f7b-6e219d65f30c-logs\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.509150 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.509216 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 
09:39:27.510586 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.517176 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.522669 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qkl4\" (UniqueName: \"kubernetes.io/projected/aee30952-18d5-4640-9f7b-6e219d65f30c-kube-api-access-7qkl4\") pod \"watcher-kuttl-api-0\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.607055 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx2gk\" (UniqueName: \"kubernetes.io/projected/0660ab33-2747-4326-8901-542c807ca75d-kube-api-access-jx2gk\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.607131 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.607163 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.607235 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.607288 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.607305 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.607455 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.607689 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn5hl\" (UniqueName: \"kubernetes.io/projected/ff629ac3-608b-43fa-89b1-ed416b1392ca-kube-api-access-nn5hl\") pod \"watcher-kuttl-applier-0\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.607722 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.607806 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0660ab33-2747-4326-8901-542c807ca75d-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.607857 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff629ac3-608b-43fa-89b1-ed416b1392ca-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.608266 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff629ac3-608b-43fa-89b1-ed416b1392ca-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.611143 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.611722 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.612630 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.625097 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn5hl\" (UniqueName: \"kubernetes.io/projected/ff629ac3-608b-43fa-89b1-ed416b1392ca-kube-api-access-nn5hl\") pod \"watcher-kuttl-applier-0\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.654097 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.689141 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.709052 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0660ab33-2747-4326-8901-542c807ca75d-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.709338 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx2gk\" (UniqueName: \"kubernetes.io/projected/0660ab33-2747-4326-8901-542c807ca75d-kube-api-access-jx2gk\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.709387 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.709410 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.709432 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-cert-memcached-mtls\") pod 
\"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.709495 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.709730 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0660ab33-2747-4326-8901-542c807ca75d-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.713366 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.718645 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.718853 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-combined-ca-bundle\") pod 
\"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.720689 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.729443 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx2gk\" (UniqueName: \"kubernetes.io/projected/0660ab33-2747-4326-8901-542c807ca75d-kube-api-access-jx2gk\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:27 crc kubenswrapper[4956]: I0314 09:39:27.771792 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:28 crc kubenswrapper[4956]: I0314 09:39:28.110806 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:39:28 crc kubenswrapper[4956]: I0314 09:39:28.218147 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:39:28 crc kubenswrapper[4956]: W0314 09:39:28.226540 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff629ac3_608b_43fa_89b1_ed416b1392ca.slice/crio-989fa6e3fdd1fd3278ee40deb791fd8aa42d87a8771888ae7e396df70e3355a4 WatchSource:0}: Error finding container 989fa6e3fdd1fd3278ee40deb791fd8aa42d87a8771888ae7e396df70e3355a4: Status 404 returned error can't find the container with id 989fa6e3fdd1fd3278ee40deb791fd8aa42d87a8771888ae7e396df70e3355a4 Mar 14 09:39:28 crc kubenswrapper[4956]: I0314 09:39:28.294556 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:39:29 crc kubenswrapper[4956]: I0314 09:39:29.043078 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"aee30952-18d5-4640-9f7b-6e219d65f30c","Type":"ContainerStarted","Data":"59aace17a2f4736d1fbd3c5460cc874e1308e8430b468d2fab904ee3b33516be"} Mar 14 09:39:29 crc kubenswrapper[4956]: I0314 09:39:29.043398 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"aee30952-18d5-4640-9f7b-6e219d65f30c","Type":"ContainerStarted","Data":"8eff293c3aa871e2c99740151015181976c6eac159876bc1dd70a8f69a26c9f7"} Mar 14 09:39:29 crc kubenswrapper[4956]: I0314 09:39:29.043505 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:29 crc kubenswrapper[4956]: 
I0314 09:39:29.043516 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"aee30952-18d5-4640-9f7b-6e219d65f30c","Type":"ContainerStarted","Data":"b661cdabcbbf67d30146b6c6851f6643c26648cf6ba422203f78f951cbeb6e56"} Mar 14 09:39:29 crc kubenswrapper[4956]: I0314 09:39:29.045831 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"ff629ac3-608b-43fa-89b1-ed416b1392ca","Type":"ContainerStarted","Data":"654e525f6ff086befd423c98d30e665719f830c22264ecc8e4b7de216a8695ca"} Mar 14 09:39:29 crc kubenswrapper[4956]: I0314 09:39:29.045876 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"ff629ac3-608b-43fa-89b1-ed416b1392ca","Type":"ContainerStarted","Data":"989fa6e3fdd1fd3278ee40deb791fd8aa42d87a8771888ae7e396df70e3355a4"} Mar 14 09:39:29 crc kubenswrapper[4956]: I0314 09:39:29.048914 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0660ab33-2747-4326-8901-542c807ca75d","Type":"ContainerStarted","Data":"45755657750e408ef6f018008d7c4072e4d556849bdb1af79c267402e4ff3a39"} Mar 14 09:39:29 crc kubenswrapper[4956]: I0314 09:39:29.048976 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0660ab33-2747-4326-8901-542c807ca75d","Type":"ContainerStarted","Data":"07459e5999a046e9374328b292041849bf7e85d842d6c10e3e739f00ea0a1378"} Mar 14 09:39:29 crc kubenswrapper[4956]: I0314 09:39:29.070747 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.070728365 podStartE2EDuration="2.070728365s" podCreationTimestamp="2026-03-14 09:39:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-14 09:39:29.063638426 +0000 UTC m=+2574.576330714" watchObservedRunningTime="2026-03-14 09:39:29.070728365 +0000 UTC m=+2574.583420633" Mar 14 09:39:29 crc kubenswrapper[4956]: I0314 09:39:29.084012 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.083992321 podStartE2EDuration="2.083992321s" podCreationTimestamp="2026-03-14 09:39:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:39:29.081087387 +0000 UTC m=+2574.593779655" watchObservedRunningTime="2026-03-14 09:39:29.083992321 +0000 UTC m=+2574.596684589" Mar 14 09:39:29 crc kubenswrapper[4956]: I0314 09:39:29.117277 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.117259813 podStartE2EDuration="2.117259813s" podCreationTimestamp="2026-03-14 09:39:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:39:29.109820735 +0000 UTC m=+2574.622513023" watchObservedRunningTime="2026-03-14 09:39:29.117259813 +0000 UTC m=+2574.629952081" Mar 14 09:39:30 crc kubenswrapper[4956]: I0314 09:39:30.208991 4956 scope.go:117] "RemoveContainer" containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:39:31 crc kubenswrapper[4956]: I0314 09:39:31.068879 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerStarted","Data":"782b831bfebf5d160529d876878bf15ad46c206265675567c1df2cedcbdb4339"} Mar 14 09:39:31 crc kubenswrapper[4956]: I0314 09:39:31.168433 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:32 crc kubenswrapper[4956]: I0314 09:39:32.654378 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:32 crc kubenswrapper[4956]: I0314 09:39:32.691094 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:37 crc kubenswrapper[4956]: I0314 09:39:37.654702 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:37 crc kubenswrapper[4956]: I0314 09:39:37.662408 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:37 crc kubenswrapper[4956]: I0314 09:39:37.690432 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:37 crc kubenswrapper[4956]: I0314 09:39:37.719430 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:37 crc kubenswrapper[4956]: I0314 09:39:37.773698 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:37 crc kubenswrapper[4956]: I0314 09:39:37.797100 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:38 crc kubenswrapper[4956]: I0314 09:39:38.127560 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:38 crc kubenswrapper[4956]: I0314 09:39:38.132059 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:38 crc kubenswrapper[4956]: I0314 09:39:38.155342 
4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:38 crc kubenswrapper[4956]: I0314 09:39:38.156963 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:40 crc kubenswrapper[4956]: I0314 09:39:40.473373 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:39:40 crc kubenswrapper[4956]: I0314 09:39:40.474042 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="ceilometer-central-agent" containerID="cri-o://e6b2e2209e9c1a6fe276f6d43a9a80b3bee36c96566c2444066163e24530d6e0" gracePeriod=30 Mar 14 09:39:40 crc kubenswrapper[4956]: I0314 09:39:40.474176 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="sg-core" containerID="cri-o://aa2207590be89a4a9ca8f9956c2c777517096d2f9d575b5b34c2a0e8aa762aba" gracePeriod=30 Mar 14 09:39:40 crc kubenswrapper[4956]: I0314 09:39:40.474176 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="proxy-httpd" containerID="cri-o://790ae61159a0102b51a406c5f83e4beefcd8cf7f418b1bf6d03aa5cbdee04089" gracePeriod=30 Mar 14 09:39:40 crc kubenswrapper[4956]: I0314 09:39:40.474211 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="ceilometer-notification-agent" containerID="cri-o://b708f3657f379b18652a41bcb08f3b90c238584016f6df9fbb265996b398aa36" gracePeriod=30 Mar 14 09:39:40 crc kubenswrapper[4956]: I0314 09:39:40.483916 4956 
prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.206:3000/\": EOF" Mar 14 09:39:41 crc kubenswrapper[4956]: I0314 09:39:41.155582 4956 generic.go:334] "Generic (PLEG): container finished" podID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerID="790ae61159a0102b51a406c5f83e4beefcd8cf7f418b1bf6d03aa5cbdee04089" exitCode=0 Mar 14 09:39:41 crc kubenswrapper[4956]: I0314 09:39:41.155620 4956 generic.go:334] "Generic (PLEG): container finished" podID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerID="aa2207590be89a4a9ca8f9956c2c777517096d2f9d575b5b34c2a0e8aa762aba" exitCode=2 Mar 14 09:39:41 crc kubenswrapper[4956]: I0314 09:39:41.155629 4956 generic.go:334] "Generic (PLEG): container finished" podID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerID="e6b2e2209e9c1a6fe276f6d43a9a80b3bee36c96566c2444066163e24530d6e0" exitCode=0 Mar 14 09:39:41 crc kubenswrapper[4956]: I0314 09:39:41.155651 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dbdeee31-d317-47d5-a88b-81086f0da9ad","Type":"ContainerDied","Data":"790ae61159a0102b51a406c5f83e4beefcd8cf7f418b1bf6d03aa5cbdee04089"} Mar 14 09:39:41 crc kubenswrapper[4956]: I0314 09:39:41.155688 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dbdeee31-d317-47d5-a88b-81086f0da9ad","Type":"ContainerDied","Data":"aa2207590be89a4a9ca8f9956c2c777517096d2f9d575b5b34c2a0e8aa762aba"} Mar 14 09:39:41 crc kubenswrapper[4956]: I0314 09:39:41.155698 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dbdeee31-d317-47d5-a88b-81086f0da9ad","Type":"ContainerDied","Data":"e6b2e2209e9c1a6fe276f6d43a9a80b3bee36c96566c2444066163e24530d6e0"} Mar 14 09:39:41 crc kubenswrapper[4956]: I0314 09:39:41.254634 4956 
prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.206:3000/\": dial tcp 10.217.0.206:3000: connect: connection refused" Mar 14 09:39:43 crc kubenswrapper[4956]: I0314 09:39:43.876031 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-x286z"] Mar 14 09:39:43 crc kubenswrapper[4956]: I0314 09:39:43.886182 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-x286z"] Mar 14 09:39:43 crc kubenswrapper[4956]: I0314 09:39:43.943432 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:39:43 crc kubenswrapper[4956]: I0314 09:39:43.943678 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="0660ab33-2747-4326-8901-542c807ca75d" containerName="watcher-decision-engine" containerID="cri-o://45755657750e408ef6f018008d7c4072e4d556849bdb1af79c267402e4ff3a39" gracePeriod=30 Mar 14 09:39:43 crc kubenswrapper[4956]: I0314 09:39:43.951384 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher06fc-account-delete-zjl76"] Mar 14 09:39:43 crc kubenswrapper[4956]: I0314 09:39:43.952539 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher06fc-account-delete-zjl76" Mar 14 09:39:43 crc kubenswrapper[4956]: I0314 09:39:43.963053 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher06fc-account-delete-zjl76"] Mar 14 09:39:44 crc kubenswrapper[4956]: I0314 09:39:44.048511 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:39:44 crc kubenswrapper[4956]: I0314 09:39:44.048761 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="aee30952-18d5-4640-9f7b-6e219d65f30c" containerName="watcher-kuttl-api-log" containerID="cri-o://8eff293c3aa871e2c99740151015181976c6eac159876bc1dd70a8f69a26c9f7" gracePeriod=30 Mar 14 09:39:44 crc kubenswrapper[4956]: I0314 09:39:44.049583 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="aee30952-18d5-4640-9f7b-6e219d65f30c" containerName="watcher-api" containerID="cri-o://59aace17a2f4736d1fbd3c5460cc874e1308e8430b468d2fab904ee3b33516be" gracePeriod=30 Mar 14 09:39:44 crc kubenswrapper[4956]: I0314 09:39:44.065509 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:39:44 crc kubenswrapper[4956]: I0314 09:39:44.065732 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="ff629ac3-608b-43fa-89b1-ed416b1392ca" containerName="watcher-applier" containerID="cri-o://654e525f6ff086befd423c98d30e665719f830c22264ecc8e4b7de216a8695ca" gracePeriod=30 Mar 14 09:39:44 crc kubenswrapper[4956]: I0314 09:39:44.084004 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2dht\" (UniqueName: \"kubernetes.io/projected/cc78739e-d989-428f-b01e-fc2a8ae8577a-kube-api-access-f2dht\") 
pod \"watcher06fc-account-delete-zjl76\" (UID: \"cc78739e-d989-428f-b01e-fc2a8ae8577a\") " pod="watcher-kuttl-default/watcher06fc-account-delete-zjl76" Mar 14 09:39:44 crc kubenswrapper[4956]: I0314 09:39:44.084087 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc78739e-d989-428f-b01e-fc2a8ae8577a-operator-scripts\") pod \"watcher06fc-account-delete-zjl76\" (UID: \"cc78739e-d989-428f-b01e-fc2a8ae8577a\") " pod="watcher-kuttl-default/watcher06fc-account-delete-zjl76" Mar 14 09:39:44 crc kubenswrapper[4956]: I0314 09:39:44.189400 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc78739e-d989-428f-b01e-fc2a8ae8577a-operator-scripts\") pod \"watcher06fc-account-delete-zjl76\" (UID: \"cc78739e-d989-428f-b01e-fc2a8ae8577a\") " pod="watcher-kuttl-default/watcher06fc-account-delete-zjl76" Mar 14 09:39:44 crc kubenswrapper[4956]: I0314 09:39:44.189510 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2dht\" (UniqueName: \"kubernetes.io/projected/cc78739e-d989-428f-b01e-fc2a8ae8577a-kube-api-access-f2dht\") pod \"watcher06fc-account-delete-zjl76\" (UID: \"cc78739e-d989-428f-b01e-fc2a8ae8577a\") " pod="watcher-kuttl-default/watcher06fc-account-delete-zjl76" Mar 14 09:39:44 crc kubenswrapper[4956]: I0314 09:39:44.190515 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc78739e-d989-428f-b01e-fc2a8ae8577a-operator-scripts\") pod \"watcher06fc-account-delete-zjl76\" (UID: \"cc78739e-d989-428f-b01e-fc2a8ae8577a\") " pod="watcher-kuttl-default/watcher06fc-account-delete-zjl76" Mar 14 09:39:44 crc kubenswrapper[4956]: I0314 09:39:44.219166 4956 generic.go:334] "Generic (PLEG): container finished" podID="aee30952-18d5-4640-9f7b-6e219d65f30c" 
containerID="8eff293c3aa871e2c99740151015181976c6eac159876bc1dd70a8f69a26c9f7" exitCode=143 Mar 14 09:39:44 crc kubenswrapper[4956]: I0314 09:39:44.219217 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"aee30952-18d5-4640-9f7b-6e219d65f30c","Type":"ContainerDied","Data":"8eff293c3aa871e2c99740151015181976c6eac159876bc1dd70a8f69a26c9f7"} Mar 14 09:39:44 crc kubenswrapper[4956]: I0314 09:39:44.238145 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2dht\" (UniqueName: \"kubernetes.io/projected/cc78739e-d989-428f-b01e-fc2a8ae8577a-kube-api-access-f2dht\") pod \"watcher06fc-account-delete-zjl76\" (UID: \"cc78739e-d989-428f-b01e-fc2a8ae8577a\") " pod="watcher-kuttl-default/watcher06fc-account-delete-zjl76" Mar 14 09:39:44 crc kubenswrapper[4956]: I0314 09:39:44.273310 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher06fc-account-delete-zjl76" Mar 14 09:39:44 crc kubenswrapper[4956]: I0314 09:39:44.847157 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher06fc-account-delete-zjl76"] Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.122867 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.229739 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ce9fb31-8246-44a1-9bd5-982abf68595d" path="/var/lib/kubelet/pods/2ce9fb31-8246-44a1-9bd5-982abf68595d/volumes" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.244364 4956 generic.go:334] "Generic (PLEG): container finished" podID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerID="b708f3657f379b18652a41bcb08f3b90c238584016f6df9fbb265996b398aa36" exitCode=0 Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.244504 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dbdeee31-d317-47d5-a88b-81086f0da9ad","Type":"ContainerDied","Data":"b708f3657f379b18652a41bcb08f3b90c238584016f6df9fbb265996b398aa36"} Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.244543 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dbdeee31-d317-47d5-a88b-81086f0da9ad","Type":"ContainerDied","Data":"ea69fbfef0ace36d0f422278f3b058f13144a5a99eb1b7203943e797ae016647"} Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.244570 4956 scope.go:117] "RemoveContainer" containerID="790ae61159a0102b51a406c5f83e4beefcd8cf7f418b1bf6d03aa5cbdee04089" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.244961 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.252442 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher06fc-account-delete-zjl76" event={"ID":"cc78739e-d989-428f-b01e-fc2a8ae8577a","Type":"ContainerStarted","Data":"e0840ba7ec5adfad0d3d833b4d5dacab1ecf9ac5b6f96c5448baeece09c17f9d"} Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.252654 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher06fc-account-delete-zjl76" event={"ID":"cc78739e-d989-428f-b01e-fc2a8ae8577a","Type":"ContainerStarted","Data":"b2b9dd4fba960fbb6e79e3612d206fdabc2eb24059d407a9f3249ce487a0da5f"} Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.272799 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher06fc-account-delete-zjl76" podStartSLOduration=2.272775324 podStartE2EDuration="2.272775324s" podCreationTimestamp="2026-03-14 09:39:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:39:45.267059149 +0000 UTC m=+2590.779751417" watchObservedRunningTime="2026-03-14 09:39:45.272775324 +0000 UTC m=+2590.785467592" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.316149 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbdeee31-d317-47d5-a88b-81086f0da9ad-log-httpd\") pod \"dbdeee31-d317-47d5-a88b-81086f0da9ad\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.316237 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-scripts\") pod \"dbdeee31-d317-47d5-a88b-81086f0da9ad\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " Mar 
14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.316287 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-ceilometer-tls-certs\") pod \"dbdeee31-d317-47d5-a88b-81086f0da9ad\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.316401 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbdeee31-d317-47d5-a88b-81086f0da9ad-run-httpd\") pod \"dbdeee31-d317-47d5-a88b-81086f0da9ad\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.316422 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmksh\" (UniqueName: \"kubernetes.io/projected/dbdeee31-d317-47d5-a88b-81086f0da9ad-kube-api-access-cmksh\") pod \"dbdeee31-d317-47d5-a88b-81086f0da9ad\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.316473 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-sg-core-conf-yaml\") pod \"dbdeee31-d317-47d5-a88b-81086f0da9ad\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.316528 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-combined-ca-bundle\") pod \"dbdeee31-d317-47d5-a88b-81086f0da9ad\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.316563 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-config-data\") pod \"dbdeee31-d317-47d5-a88b-81086f0da9ad\" (UID: \"dbdeee31-d317-47d5-a88b-81086f0da9ad\") " Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.318367 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbdeee31-d317-47d5-a88b-81086f0da9ad-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dbdeee31-d317-47d5-a88b-81086f0da9ad" (UID: "dbdeee31-d317-47d5-a88b-81086f0da9ad"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.319707 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbdeee31-d317-47d5-a88b-81086f0da9ad-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.322811 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbdeee31-d317-47d5-a88b-81086f0da9ad-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dbdeee31-d317-47d5-a88b-81086f0da9ad" (UID: "dbdeee31-d317-47d5-a88b-81086f0da9ad"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.324415 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbdeee31-d317-47d5-a88b-81086f0da9ad-kube-api-access-cmksh" (OuterVolumeSpecName: "kube-api-access-cmksh") pod "dbdeee31-d317-47d5-a88b-81086f0da9ad" (UID: "dbdeee31-d317-47d5-a88b-81086f0da9ad"). InnerVolumeSpecName "kube-api-access-cmksh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.328188 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-scripts" (OuterVolumeSpecName: "scripts") pod "dbdeee31-d317-47d5-a88b-81086f0da9ad" (UID: "dbdeee31-d317-47d5-a88b-81086f0da9ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.360663 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dbdeee31-d317-47d5-a88b-81086f0da9ad" (UID: "dbdeee31-d317-47d5-a88b-81086f0da9ad"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.392523 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "dbdeee31-d317-47d5-a88b-81086f0da9ad" (UID: "dbdeee31-d317-47d5-a88b-81086f0da9ad"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.408119 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbdeee31-d317-47d5-a88b-81086f0da9ad" (UID: "dbdeee31-d317-47d5-a88b-81086f0da9ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.421263 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.421289 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.421299 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbdeee31-d317-47d5-a88b-81086f0da9ad-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.421308 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmksh\" (UniqueName: \"kubernetes.io/projected/dbdeee31-d317-47d5-a88b-81086f0da9ad-kube-api-access-cmksh\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.421316 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.421324 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.498723 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-config-data" (OuterVolumeSpecName: "config-data") pod "dbdeee31-d317-47d5-a88b-81086f0da9ad" (UID: 
"dbdeee31-d317-47d5-a88b-81086f0da9ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.522979 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbdeee31-d317-47d5-a88b-81086f0da9ad-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.543732 4956 scope.go:117] "RemoveContainer" containerID="aa2207590be89a4a9ca8f9956c2c777517096d2f9d575b5b34c2a0e8aa762aba" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.596521 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.608473 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.617134 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:39:45 crc kubenswrapper[4956]: E0314 09:39:45.617564 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="ceilometer-notification-agent" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.617585 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="ceilometer-notification-agent" Mar 14 09:39:45 crc kubenswrapper[4956]: E0314 09:39:45.617604 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="ceilometer-central-agent" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.617612 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="ceilometer-central-agent" Mar 14 09:39:45 crc kubenswrapper[4956]: E0314 09:39:45.620115 4956 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="proxy-httpd" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.620154 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="proxy-httpd" Mar 14 09:39:45 crc kubenswrapper[4956]: E0314 09:39:45.620175 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="sg-core" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.620183 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="sg-core" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.620557 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="proxy-httpd" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.620582 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="ceilometer-notification-agent" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.620598 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="sg-core" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.620613 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" containerName="ceilometer-central-agent" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.622186 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.626851 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.627312 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.628206 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.628315 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.628392 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-scripts\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.628503 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc 
kubenswrapper[4956]: I0314 09:39:45.628616 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwx4m\" (UniqueName: \"kubernetes.io/projected/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-kube-api-access-qwx4m\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.630058 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-log-httpd\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.630163 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-run-httpd\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.630245 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-config-data\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.630771 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.632915 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.634867 4956 scope.go:117] "RemoveContainer" 
containerID="b708f3657f379b18652a41bcb08f3b90c238584016f6df9fbb265996b398aa36" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.667541 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.682198 4956 scope.go:117] "RemoveContainer" containerID="e6b2e2209e9c1a6fe276f6d43a9a80b3bee36c96566c2444066163e24530d6e0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.709191 4956 scope.go:117] "RemoveContainer" containerID="790ae61159a0102b51a406c5f83e4beefcd8cf7f418b1bf6d03aa5cbdee04089" Mar 14 09:39:45 crc kubenswrapper[4956]: E0314 09:39:45.709643 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"790ae61159a0102b51a406c5f83e4beefcd8cf7f418b1bf6d03aa5cbdee04089\": container with ID starting with 790ae61159a0102b51a406c5f83e4beefcd8cf7f418b1bf6d03aa5cbdee04089 not found: ID does not exist" containerID="790ae61159a0102b51a406c5f83e4beefcd8cf7f418b1bf6d03aa5cbdee04089" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.710356 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"790ae61159a0102b51a406c5f83e4beefcd8cf7f418b1bf6d03aa5cbdee04089"} err="failed to get container status \"790ae61159a0102b51a406c5f83e4beefcd8cf7f418b1bf6d03aa5cbdee04089\": rpc error: code = NotFound desc = could not find container \"790ae61159a0102b51a406c5f83e4beefcd8cf7f418b1bf6d03aa5cbdee04089\": container with ID starting with 790ae61159a0102b51a406c5f83e4beefcd8cf7f418b1bf6d03aa5cbdee04089 not found: ID does not exist" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.710391 4956 scope.go:117] "RemoveContainer" containerID="aa2207590be89a4a9ca8f9956c2c777517096d2f9d575b5b34c2a0e8aa762aba" Mar 14 09:39:45 crc kubenswrapper[4956]: E0314 09:39:45.710970 4956 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"aa2207590be89a4a9ca8f9956c2c777517096d2f9d575b5b34c2a0e8aa762aba\": container with ID starting with aa2207590be89a4a9ca8f9956c2c777517096d2f9d575b5b34c2a0e8aa762aba not found: ID does not exist" containerID="aa2207590be89a4a9ca8f9956c2c777517096d2f9d575b5b34c2a0e8aa762aba" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.711044 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2207590be89a4a9ca8f9956c2c777517096d2f9d575b5b34c2a0e8aa762aba"} err="failed to get container status \"aa2207590be89a4a9ca8f9956c2c777517096d2f9d575b5b34c2a0e8aa762aba\": rpc error: code = NotFound desc = could not find container \"aa2207590be89a4a9ca8f9956c2c777517096d2f9d575b5b34c2a0e8aa762aba\": container with ID starting with aa2207590be89a4a9ca8f9956c2c777517096d2f9d575b5b34c2a0e8aa762aba not found: ID does not exist" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.711074 4956 scope.go:117] "RemoveContainer" containerID="b708f3657f379b18652a41bcb08f3b90c238584016f6df9fbb265996b398aa36" Mar 14 09:39:45 crc kubenswrapper[4956]: E0314 09:39:45.711655 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b708f3657f379b18652a41bcb08f3b90c238584016f6df9fbb265996b398aa36\": container with ID starting with b708f3657f379b18652a41bcb08f3b90c238584016f6df9fbb265996b398aa36 not found: ID does not exist" containerID="b708f3657f379b18652a41bcb08f3b90c238584016f6df9fbb265996b398aa36" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.711683 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b708f3657f379b18652a41bcb08f3b90c238584016f6df9fbb265996b398aa36"} err="failed to get container status \"b708f3657f379b18652a41bcb08f3b90c238584016f6df9fbb265996b398aa36\": rpc error: code = NotFound desc = could not find container 
\"b708f3657f379b18652a41bcb08f3b90c238584016f6df9fbb265996b398aa36\": container with ID starting with b708f3657f379b18652a41bcb08f3b90c238584016f6df9fbb265996b398aa36 not found: ID does not exist" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.711704 4956 scope.go:117] "RemoveContainer" containerID="e6b2e2209e9c1a6fe276f6d43a9a80b3bee36c96566c2444066163e24530d6e0" Mar 14 09:39:45 crc kubenswrapper[4956]: E0314 09:39:45.711968 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6b2e2209e9c1a6fe276f6d43a9a80b3bee36c96566c2444066163e24530d6e0\": container with ID starting with e6b2e2209e9c1a6fe276f6d43a9a80b3bee36c96566c2444066163e24530d6e0 not found: ID does not exist" containerID="e6b2e2209e9c1a6fe276f6d43a9a80b3bee36c96566c2444066163e24530d6e0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.712000 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b2e2209e9c1a6fe276f6d43a9a80b3bee36c96566c2444066163e24530d6e0"} err="failed to get container status \"e6b2e2209e9c1a6fe276f6d43a9a80b3bee36c96566c2444066163e24530d6e0\": rpc error: code = NotFound desc = could not find container \"e6b2e2209e9c1a6fe276f6d43a9a80b3bee36c96566c2444066163e24530d6e0\": container with ID starting with e6b2e2209e9c1a6fe276f6d43a9a80b3bee36c96566c2444066163e24530d6e0 not found: ID does not exist" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.731049 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-cert-memcached-mtls\") pod \"aee30952-18d5-4640-9f7b-6e219d65f30c\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.731108 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-combined-ca-bundle\") pod \"aee30952-18d5-4640-9f7b-6e219d65f30c\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.731168 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-config-data\") pod \"aee30952-18d5-4640-9f7b-6e219d65f30c\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.731195 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qkl4\" (UniqueName: \"kubernetes.io/projected/aee30952-18d5-4640-9f7b-6e219d65f30c-kube-api-access-7qkl4\") pod \"aee30952-18d5-4640-9f7b-6e219d65f30c\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.731246 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aee30952-18d5-4640-9f7b-6e219d65f30c-logs\") pod \"aee30952-18d5-4640-9f7b-6e219d65f30c\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.731306 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-custom-prometheus-ca\") pod \"aee30952-18d5-4640-9f7b-6e219d65f30c\" (UID: \"aee30952-18d5-4640-9f7b-6e219d65f30c\") " Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.731543 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc 
kubenswrapper[4956]: I0314 09:39:45.731571 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.731962 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-scripts\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.732011 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.732012 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aee30952-18d5-4640-9f7b-6e219d65f30c-logs" (OuterVolumeSpecName: "logs") pod "aee30952-18d5-4640-9f7b-6e219d65f30c" (UID: "aee30952-18d5-4640-9f7b-6e219d65f30c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.732086 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwx4m\" (UniqueName: \"kubernetes.io/projected/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-kube-api-access-qwx4m\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.732117 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-log-httpd\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.732138 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-run-httpd\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.732161 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-config-data\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.732245 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aee30952-18d5-4640-9f7b-6e219d65f30c-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.733307 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-log-httpd\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.735670 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-run-httpd\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.740042 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-config-data\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.741085 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-scripts\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.741603 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.748874 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc 
kubenswrapper[4956]: I0314 09:39:45.753344 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.756168 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee30952-18d5-4640-9f7b-6e219d65f30c-kube-api-access-7qkl4" (OuterVolumeSpecName: "kube-api-access-7qkl4") pod "aee30952-18d5-4640-9f7b-6e219d65f30c" (UID: "aee30952-18d5-4640-9f7b-6e219d65f30c"). InnerVolumeSpecName "kube-api-access-7qkl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.756342 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwx4m\" (UniqueName: \"kubernetes.io/projected/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-kube-api-access-qwx4m\") pod \"ceilometer-0\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.779636 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aee30952-18d5-4640-9f7b-6e219d65f30c" (UID: "aee30952-18d5-4640-9f7b-6e219d65f30c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.782598 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "aee30952-18d5-4640-9f7b-6e219d65f30c" (UID: "aee30952-18d5-4640-9f7b-6e219d65f30c"). 
InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.828290 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-config-data" (OuterVolumeSpecName: "config-data") pod "aee30952-18d5-4640-9f7b-6e219d65f30c" (UID: "aee30952-18d5-4640-9f7b-6e219d65f30c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.833245 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.833286 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.833299 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.833311 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qkl4\" (UniqueName: \"kubernetes.io/projected/aee30952-18d5-4640-9f7b-6e219d65f30c-kube-api-access-7qkl4\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.848159 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "aee30952-18d5-4640-9f7b-6e219d65f30c" (UID: "aee30952-18d5-4640-9f7b-6e219d65f30c"). 
InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.935645 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/aee30952-18d5-4640-9f7b-6e219d65f30c-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:45 crc kubenswrapper[4956]: I0314 09:39:45.947258 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.128875 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.142151 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff629ac3-608b-43fa-89b1-ed416b1392ca-logs\") pod \"ff629ac3-608b-43fa-89b1-ed416b1392ca\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.142221 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-config-data\") pod \"ff629ac3-608b-43fa-89b1-ed416b1392ca\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.142321 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-cert-memcached-mtls\") pod \"ff629ac3-608b-43fa-89b1-ed416b1392ca\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.142343 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn5hl\" (UniqueName: 
\"kubernetes.io/projected/ff629ac3-608b-43fa-89b1-ed416b1392ca-kube-api-access-nn5hl\") pod \"ff629ac3-608b-43fa-89b1-ed416b1392ca\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.142390 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-combined-ca-bundle\") pod \"ff629ac3-608b-43fa-89b1-ed416b1392ca\" (UID: \"ff629ac3-608b-43fa-89b1-ed416b1392ca\") " Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.143004 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff629ac3-608b-43fa-89b1-ed416b1392ca-logs" (OuterVolumeSpecName: "logs") pod "ff629ac3-608b-43fa-89b1-ed416b1392ca" (UID: "ff629ac3-608b-43fa-89b1-ed416b1392ca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.152470 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff629ac3-608b-43fa-89b1-ed416b1392ca-kube-api-access-nn5hl" (OuterVolumeSpecName: "kube-api-access-nn5hl") pod "ff629ac3-608b-43fa-89b1-ed416b1392ca" (UID: "ff629ac3-608b-43fa-89b1-ed416b1392ca"). InnerVolumeSpecName "kube-api-access-nn5hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.190116 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff629ac3-608b-43fa-89b1-ed416b1392ca" (UID: "ff629ac3-608b-43fa-89b1-ed416b1392ca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.219954 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-config-data" (OuterVolumeSpecName: "config-data") pod "ff629ac3-608b-43fa-89b1-ed416b1392ca" (UID: "ff629ac3-608b-43fa-89b1-ed416b1392ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.231968 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "ff629ac3-608b-43fa-89b1-ed416b1392ca" (UID: "ff629ac3-608b-43fa-89b1-ed416b1392ca"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.244831 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.244870 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn5hl\" (UniqueName: \"kubernetes.io/projected/ff629ac3-608b-43fa-89b1-ed416b1392ca-kube-api-access-nn5hl\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.244880 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.244888 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff629ac3-608b-43fa-89b1-ed416b1392ca-logs\") on node \"crc\" 
DevicePath \"\"" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.244897 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff629ac3-608b-43fa-89b1-ed416b1392ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.250666 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.269036 4956 generic.go:334] "Generic (PLEG): container finished" podID="cc78739e-d989-428f-b01e-fc2a8ae8577a" containerID="e0840ba7ec5adfad0d3d833b4d5dacab1ecf9ac5b6f96c5448baeece09c17f9d" exitCode=0 Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.269101 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher06fc-account-delete-zjl76" event={"ID":"cc78739e-d989-428f-b01e-fc2a8ae8577a","Type":"ContainerDied","Data":"e0840ba7ec5adfad0d3d833b4d5dacab1ecf9ac5b6f96c5448baeece09c17f9d"} Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.274607 4956 generic.go:334] "Generic (PLEG): container finished" podID="aee30952-18d5-4640-9f7b-6e219d65f30c" containerID="59aace17a2f4736d1fbd3c5460cc874e1308e8430b468d2fab904ee3b33516be" exitCode=0 Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.274682 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"aee30952-18d5-4640-9f7b-6e219d65f30c","Type":"ContainerDied","Data":"59aace17a2f4736d1fbd3c5460cc874e1308e8430b468d2fab904ee3b33516be"} Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.274700 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.274717 4956 scope.go:117] "RemoveContainer" containerID="59aace17a2f4736d1fbd3c5460cc874e1308e8430b468d2fab904ee3b33516be" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.274705 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"aee30952-18d5-4640-9f7b-6e219d65f30c","Type":"ContainerDied","Data":"b661cdabcbbf67d30146b6c6851f6643c26648cf6ba422203f78f951cbeb6e56"} Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.279346 4956 generic.go:334] "Generic (PLEG): container finished" podID="ff629ac3-608b-43fa-89b1-ed416b1392ca" containerID="654e525f6ff086befd423c98d30e665719f830c22264ecc8e4b7de216a8695ca" exitCode=0 Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.279400 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"ff629ac3-608b-43fa-89b1-ed416b1392ca","Type":"ContainerDied","Data":"654e525f6ff086befd423c98d30e665719f830c22264ecc8e4b7de216a8695ca"} Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.279421 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"ff629ac3-608b-43fa-89b1-ed416b1392ca","Type":"ContainerDied","Data":"989fa6e3fdd1fd3278ee40deb791fd8aa42d87a8771888ae7e396df70e3355a4"} Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.279471 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.289082 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"77f83d67-2555-4e0e-b8d9-f772b4ddea3a","Type":"ContainerStarted","Data":"0f9566fd0a0309d0b49aca1208bbc7615c39967c074f2de6bf643807ead2c4c2"} Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.322638 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.325719 4956 scope.go:117] "RemoveContainer" containerID="8eff293c3aa871e2c99740151015181976c6eac159876bc1dd70a8f69a26c9f7" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.330436 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.339582 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.347265 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.348842 4956 scope.go:117] "RemoveContainer" containerID="59aace17a2f4736d1fbd3c5460cc874e1308e8430b468d2fab904ee3b33516be" Mar 14 09:39:46 crc kubenswrapper[4956]: E0314 09:39:46.349381 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59aace17a2f4736d1fbd3c5460cc874e1308e8430b468d2fab904ee3b33516be\": container with ID starting with 59aace17a2f4736d1fbd3c5460cc874e1308e8430b468d2fab904ee3b33516be not found: ID does not exist" containerID="59aace17a2f4736d1fbd3c5460cc874e1308e8430b468d2fab904ee3b33516be" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.349407 4956 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59aace17a2f4736d1fbd3c5460cc874e1308e8430b468d2fab904ee3b33516be"} err="failed to get container status \"59aace17a2f4736d1fbd3c5460cc874e1308e8430b468d2fab904ee3b33516be\": rpc error: code = NotFound desc = could not find container \"59aace17a2f4736d1fbd3c5460cc874e1308e8430b468d2fab904ee3b33516be\": container with ID starting with 59aace17a2f4736d1fbd3c5460cc874e1308e8430b468d2fab904ee3b33516be not found: ID does not exist" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.349428 4956 scope.go:117] "RemoveContainer" containerID="8eff293c3aa871e2c99740151015181976c6eac159876bc1dd70a8f69a26c9f7" Mar 14 09:39:46 crc kubenswrapper[4956]: E0314 09:39:46.349824 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eff293c3aa871e2c99740151015181976c6eac159876bc1dd70a8f69a26c9f7\": container with ID starting with 8eff293c3aa871e2c99740151015181976c6eac159876bc1dd70a8f69a26c9f7 not found: ID does not exist" containerID="8eff293c3aa871e2c99740151015181976c6eac159876bc1dd70a8f69a26c9f7" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.349844 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eff293c3aa871e2c99740151015181976c6eac159876bc1dd70a8f69a26c9f7"} err="failed to get container status \"8eff293c3aa871e2c99740151015181976c6eac159876bc1dd70a8f69a26c9f7\": rpc error: code = NotFound desc = could not find container \"8eff293c3aa871e2c99740151015181976c6eac159876bc1dd70a8f69a26c9f7\": container with ID starting with 8eff293c3aa871e2c99740151015181976c6eac159876bc1dd70a8f69a26c9f7 not found: ID does not exist" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.349859 4956 scope.go:117] "RemoveContainer" containerID="654e525f6ff086befd423c98d30e665719f830c22264ecc8e4b7de216a8695ca" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.371201 4956 scope.go:117] 
"RemoveContainer" containerID="654e525f6ff086befd423c98d30e665719f830c22264ecc8e4b7de216a8695ca" Mar 14 09:39:46 crc kubenswrapper[4956]: E0314 09:39:46.371772 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"654e525f6ff086befd423c98d30e665719f830c22264ecc8e4b7de216a8695ca\": container with ID starting with 654e525f6ff086befd423c98d30e665719f830c22264ecc8e4b7de216a8695ca not found: ID does not exist" containerID="654e525f6ff086befd423c98d30e665719f830c22264ecc8e4b7de216a8695ca" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.371827 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"654e525f6ff086befd423c98d30e665719f830c22264ecc8e4b7de216a8695ca"} err="failed to get container status \"654e525f6ff086befd423c98d30e665719f830c22264ecc8e4b7de216a8695ca\": rpc error: code = NotFound desc = could not find container \"654e525f6ff086befd423c98d30e665719f830c22264ecc8e4b7de216a8695ca\": container with ID starting with 654e525f6ff086befd423c98d30e665719f830c22264ecc8e4b7de216a8695ca not found: ID does not exist" Mar 14 09:39:46 crc kubenswrapper[4956]: I0314 09:39:46.842740 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:39:47 crc kubenswrapper[4956]: I0314 09:39:47.218962 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aee30952-18d5-4640-9f7b-6e219d65f30c" path="/var/lib/kubelet/pods/aee30952-18d5-4640-9f7b-6e219d65f30c/volumes" Mar 14 09:39:47 crc kubenswrapper[4956]: I0314 09:39:47.219878 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbdeee31-d317-47d5-a88b-81086f0da9ad" path="/var/lib/kubelet/pods/dbdeee31-d317-47d5-a88b-81086f0da9ad/volumes" Mar 14 09:39:47 crc kubenswrapper[4956]: I0314 09:39:47.221623 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff629ac3-608b-43fa-89b1-ed416b1392ca" 
path="/var/lib/kubelet/pods/ff629ac3-608b-43fa-89b1-ed416b1392ca/volumes" Mar 14 09:39:47 crc kubenswrapper[4956]: I0314 09:39:47.305069 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"77f83d67-2555-4e0e-b8d9-f772b4ddea3a","Type":"ContainerStarted","Data":"8c96bea07202ab4f08d3726834a97a6d63b8507976c6e5fb757e142f3c7599dc"} Mar 14 09:39:47 crc kubenswrapper[4956]: I0314 09:39:47.654596 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher06fc-account-delete-zjl76" Mar 14 09:39:47 crc kubenswrapper[4956]: I0314 09:39:47.768141 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2dht\" (UniqueName: \"kubernetes.io/projected/cc78739e-d989-428f-b01e-fc2a8ae8577a-kube-api-access-f2dht\") pod \"cc78739e-d989-428f-b01e-fc2a8ae8577a\" (UID: \"cc78739e-d989-428f-b01e-fc2a8ae8577a\") " Mar 14 09:39:47 crc kubenswrapper[4956]: I0314 09:39:47.768316 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc78739e-d989-428f-b01e-fc2a8ae8577a-operator-scripts\") pod \"cc78739e-d989-428f-b01e-fc2a8ae8577a\" (UID: \"cc78739e-d989-428f-b01e-fc2a8ae8577a\") " Mar 14 09:39:47 crc kubenswrapper[4956]: I0314 09:39:47.769186 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc78739e-d989-428f-b01e-fc2a8ae8577a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cc78739e-d989-428f-b01e-fc2a8ae8577a" (UID: "cc78739e-d989-428f-b01e-fc2a8ae8577a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:39:47 crc kubenswrapper[4956]: I0314 09:39:47.772680 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc78739e-d989-428f-b01e-fc2a8ae8577a-kube-api-access-f2dht" (OuterVolumeSpecName: "kube-api-access-f2dht") pod "cc78739e-d989-428f-b01e-fc2a8ae8577a" (UID: "cc78739e-d989-428f-b01e-fc2a8ae8577a"). InnerVolumeSpecName "kube-api-access-f2dht". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:39:47 crc kubenswrapper[4956]: I0314 09:39:47.870714 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc78739e-d989-428f-b01e-fc2a8ae8577a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:47 crc kubenswrapper[4956]: I0314 09:39:47.870998 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2dht\" (UniqueName: \"kubernetes.io/projected/cc78739e-d989-428f-b01e-fc2a8ae8577a-kube-api-access-f2dht\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.348103 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"77f83d67-2555-4e0e-b8d9-f772b4ddea3a","Type":"ContainerStarted","Data":"d794da0669cf87d346e1649003b7240d35ad4775076d8809790228f00cd1fcc6"} Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.351678 4956 generic.go:334] "Generic (PLEG): container finished" podID="0660ab33-2747-4326-8901-542c807ca75d" containerID="45755657750e408ef6f018008d7c4072e4d556849bdb1af79c267402e4ff3a39" exitCode=0 Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.351784 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0660ab33-2747-4326-8901-542c807ca75d","Type":"ContainerDied","Data":"45755657750e408ef6f018008d7c4072e4d556849bdb1af79c267402e4ff3a39"} Mar 14 09:39:48 crc 
kubenswrapper[4956]: I0314 09:39:48.351845 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0660ab33-2747-4326-8901-542c807ca75d","Type":"ContainerDied","Data":"07459e5999a046e9374328b292041849bf7e85d842d6c10e3e739f00ea0a1378"} Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.351861 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07459e5999a046e9374328b292041849bf7e85d842d6c10e3e739f00ea0a1378" Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.363771 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher06fc-account-delete-zjl76" event={"ID":"cc78739e-d989-428f-b01e-fc2a8ae8577a","Type":"ContainerDied","Data":"b2b9dd4fba960fbb6e79e3612d206fdabc2eb24059d407a9f3249ce487a0da5f"} Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.363811 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2b9dd4fba960fbb6e79e3612d206fdabc2eb24059d407a9f3249ce487a0da5f" Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.363872 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher06fc-account-delete-zjl76" Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.404001 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.496188 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-custom-prometheus-ca\") pod \"0660ab33-2747-4326-8901-542c807ca75d\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.496245 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-combined-ca-bundle\") pod \"0660ab33-2747-4326-8901-542c807ca75d\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.496288 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx2gk\" (UniqueName: \"kubernetes.io/projected/0660ab33-2747-4326-8901-542c807ca75d-kube-api-access-jx2gk\") pod \"0660ab33-2747-4326-8901-542c807ca75d\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.496431 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0660ab33-2747-4326-8901-542c807ca75d-logs\") pod \"0660ab33-2747-4326-8901-542c807ca75d\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.496528 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-cert-memcached-mtls\") pod \"0660ab33-2747-4326-8901-542c807ca75d\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.496568 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-config-data\") pod \"0660ab33-2747-4326-8901-542c807ca75d\" (UID: \"0660ab33-2747-4326-8901-542c807ca75d\") " Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.497580 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0660ab33-2747-4326-8901-542c807ca75d-logs" (OuterVolumeSpecName: "logs") pod "0660ab33-2747-4326-8901-542c807ca75d" (UID: "0660ab33-2747-4326-8901-542c807ca75d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.513912 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0660ab33-2747-4326-8901-542c807ca75d-kube-api-access-jx2gk" (OuterVolumeSpecName: "kube-api-access-jx2gk") pod "0660ab33-2747-4326-8901-542c807ca75d" (UID: "0660ab33-2747-4326-8901-542c807ca75d"). InnerVolumeSpecName "kube-api-access-jx2gk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.532601 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0660ab33-2747-4326-8901-542c807ca75d" (UID: "0660ab33-2747-4326-8901-542c807ca75d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.550083 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "0660ab33-2747-4326-8901-542c807ca75d" (UID: "0660ab33-2747-4326-8901-542c807ca75d"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.558068 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-config-data" (OuterVolumeSpecName: "config-data") pod "0660ab33-2747-4326-8901-542c807ca75d" (UID: "0660ab33-2747-4326-8901-542c807ca75d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.596555 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "0660ab33-2747-4326-8901-542c807ca75d" (UID: "0660ab33-2747-4326-8901-542c807ca75d"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.601348 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.601385 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.601405 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx2gk\" (UniqueName: \"kubernetes.io/projected/0660ab33-2747-4326-8901-542c807ca75d-kube-api-access-jx2gk\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.601421 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0660ab33-2747-4326-8901-542c807ca75d-logs\") on node \"crc\" 
DevicePath \"\"" Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.601435 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:48 crc kubenswrapper[4956]: I0314 09:39:48.601451 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0660ab33-2747-4326-8901-542c807ca75d-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.007721 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-mp7mc"] Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.021951 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher06fc-account-delete-zjl76"] Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.027812 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-06fc-account-create-update-kvddn"] Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.034200 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-mp7mc"] Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.040428 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher06fc-account-delete-zjl76"] Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.047128 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-06fc-account-create-update-kvddn"] Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.219441 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65fb877a-5cef-4754-be35-f7bd08a21b07" path="/var/lib/kubelet/pods/65fb877a-5cef-4754-be35-f7bd08a21b07/volumes" Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.220158 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="674a1847-6823-446f-9784-88f3bb7055b9" path="/var/lib/kubelet/pods/674a1847-6823-446f-9784-88f3bb7055b9/volumes" Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.220696 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc78739e-d989-428f-b01e-fc2a8ae8577a" path="/var/lib/kubelet/pods/cc78739e-d989-428f-b01e-fc2a8ae8577a/volumes" Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.374561 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"77f83d67-2555-4e0e-b8d9-f772b4ddea3a","Type":"ContainerStarted","Data":"f4fe492e570dc32fd84f83fe630f15b22301e6f5e9664d7f8ba359f7b574a723"} Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.374604 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.406610 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.414345 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.986534 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-659h8"] Mar 14 09:39:49 crc kubenswrapper[4956]: E0314 09:39:49.986944 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc78739e-d989-428f-b01e-fc2a8ae8577a" containerName="mariadb-account-delete" Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.986963 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc78739e-d989-428f-b01e-fc2a8ae8577a" containerName="mariadb-account-delete" Mar 14 09:39:49 crc kubenswrapper[4956]: E0314 09:39:49.986978 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aee30952-18d5-4640-9f7b-6e219d65f30c" containerName="watcher-api" Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.986984 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee30952-18d5-4640-9f7b-6e219d65f30c" containerName="watcher-api" Mar 14 09:39:49 crc kubenswrapper[4956]: E0314 09:39:49.987011 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff629ac3-608b-43fa-89b1-ed416b1392ca" containerName="watcher-applier" Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.987017 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff629ac3-608b-43fa-89b1-ed416b1392ca" containerName="watcher-applier" Mar 14 09:39:49 crc kubenswrapper[4956]: E0314 09:39:49.987029 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0660ab33-2747-4326-8901-542c807ca75d" containerName="watcher-decision-engine" Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.987035 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0660ab33-2747-4326-8901-542c807ca75d" containerName="watcher-decision-engine" Mar 14 09:39:49 crc kubenswrapper[4956]: E0314 09:39:49.987045 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee30952-18d5-4640-9f7b-6e219d65f30c" containerName="watcher-kuttl-api-log" Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.987051 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee30952-18d5-4640-9f7b-6e219d65f30c" containerName="watcher-kuttl-api-log" Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.987220 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee30952-18d5-4640-9f7b-6e219d65f30c" containerName="watcher-api" Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.987233 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="0660ab33-2747-4326-8901-542c807ca75d" containerName="watcher-decision-engine" Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.987243 4956 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="aee30952-18d5-4640-9f7b-6e219d65f30c" containerName="watcher-kuttl-api-log" Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.987252 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff629ac3-608b-43fa-89b1-ed416b1392ca" containerName="watcher-applier" Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.987261 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc78739e-d989-428f-b01e-fc2a8ae8577a" containerName="mariadb-account-delete" Mar 14 09:39:49 crc kubenswrapper[4956]: I0314 09:39:49.987851 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-659h8" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.004199 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-659h8"] Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.024220 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/729edbbf-6306-4988-98f5-894516eee11f-operator-scripts\") pod \"watcher-db-create-659h8\" (UID: \"729edbbf-6306-4988-98f5-894516eee11f\") " pod="watcher-kuttl-default/watcher-db-create-659h8" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.024473 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prb8m\" (UniqueName: \"kubernetes.io/projected/729edbbf-6306-4988-98f5-894516eee11f-kube-api-access-prb8m\") pod \"watcher-db-create-659h8\" (UID: \"729edbbf-6306-4988-98f5-894516eee11f\") " pod="watcher-kuttl-default/watcher-db-create-659h8" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.117901 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm"] Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.119724 4956 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm"] Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.119946 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.123068 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.126238 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prb8m\" (UniqueName: \"kubernetes.io/projected/729edbbf-6306-4988-98f5-894516eee11f-kube-api-access-prb8m\") pod \"watcher-db-create-659h8\" (UID: \"729edbbf-6306-4988-98f5-894516eee11f\") " pod="watcher-kuttl-default/watcher-db-create-659h8" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.126353 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/729edbbf-6306-4988-98f5-894516eee11f-operator-scripts\") pod \"watcher-db-create-659h8\" (UID: \"729edbbf-6306-4988-98f5-894516eee11f\") " pod="watcher-kuttl-default/watcher-db-create-659h8" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.127744 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/729edbbf-6306-4988-98f5-894516eee11f-operator-scripts\") pod \"watcher-db-create-659h8\" (UID: \"729edbbf-6306-4988-98f5-894516eee11f\") " pod="watcher-kuttl-default/watcher-db-create-659h8" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.149052 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prb8m\" (UniqueName: \"kubernetes.io/projected/729edbbf-6306-4988-98f5-894516eee11f-kube-api-access-prb8m\") pod \"watcher-db-create-659h8\" (UID: 
\"729edbbf-6306-4988-98f5-894516eee11f\") " pod="watcher-kuttl-default/watcher-db-create-659h8" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.229067 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e175020b-d63c-4bdb-b605-78838ddb4f3f-operator-scripts\") pod \"watcher-db6f-account-create-update-7bmvm\" (UID: \"e175020b-d63c-4bdb-b605-78838ddb4f3f\") " pod="watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.229385 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlfzx\" (UniqueName: \"kubernetes.io/projected/e175020b-d63c-4bdb-b605-78838ddb4f3f-kube-api-access-hlfzx\") pod \"watcher-db6f-account-create-update-7bmvm\" (UID: \"e175020b-d63c-4bdb-b605-78838ddb4f3f\") " pod="watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.307755 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-659h8" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.330382 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlfzx\" (UniqueName: \"kubernetes.io/projected/e175020b-d63c-4bdb-b605-78838ddb4f3f-kube-api-access-hlfzx\") pod \"watcher-db6f-account-create-update-7bmvm\" (UID: \"e175020b-d63c-4bdb-b605-78838ddb4f3f\") " pod="watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.330530 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e175020b-d63c-4bdb-b605-78838ddb4f3f-operator-scripts\") pod \"watcher-db6f-account-create-update-7bmvm\" (UID: \"e175020b-d63c-4bdb-b605-78838ddb4f3f\") " pod="watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.331211 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e175020b-d63c-4bdb-b605-78838ddb4f3f-operator-scripts\") pod \"watcher-db6f-account-create-update-7bmvm\" (UID: \"e175020b-d63c-4bdb-b605-78838ddb4f3f\") " pod="watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.349276 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlfzx\" (UniqueName: \"kubernetes.io/projected/e175020b-d63c-4bdb-b605-78838ddb4f3f-kube-api-access-hlfzx\") pod \"watcher-db6f-account-create-update-7bmvm\" (UID: \"e175020b-d63c-4bdb-b605-78838ddb4f3f\") " pod="watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.387831 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"77f83d67-2555-4e0e-b8d9-f772b4ddea3a","Type":"ContainerStarted","Data":"022deb1830c2782aa3395483d3c83d3db219bff216e8bb3472fcf7797da8c1a1"} Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.388081 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerName="ceilometer-central-agent" containerID="cri-o://8c96bea07202ab4f08d3726834a97a6d63b8507976c6e5fb757e142f3c7599dc" gracePeriod=30 Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.388896 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.389373 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerName="sg-core" containerID="cri-o://f4fe492e570dc32fd84f83fe630f15b22301e6f5e9664d7f8ba359f7b574a723" gracePeriod=30 Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.389463 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerName="proxy-httpd" containerID="cri-o://022deb1830c2782aa3395483d3c83d3db219bff216e8bb3472fcf7797da8c1a1" gracePeriod=30 Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.389521 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerName="ceilometer-notification-agent" containerID="cri-o://d794da0669cf87d346e1649003b7240d35ad4775076d8809790228f00cd1fcc6" gracePeriod=30 Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.423685 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.570109361 
podStartE2EDuration="5.423670418s" podCreationTimestamp="2026-03-14 09:39:45 +0000 UTC" firstStartedPulling="2026-03-14 09:39:46.257011806 +0000 UTC m=+2591.769704074" lastFinishedPulling="2026-03-14 09:39:50.110572863 +0000 UTC m=+2595.623265131" observedRunningTime="2026-03-14 09:39:50.423009481 +0000 UTC m=+2595.935701749" watchObservedRunningTime="2026-03-14 09:39:50.423670418 +0000 UTC m=+2595.936362676" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.563348 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm" Mar 14 09:39:50 crc kubenswrapper[4956]: I0314 09:39:50.791410 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-659h8"] Mar 14 09:39:50 crc kubenswrapper[4956]: W0314 09:39:50.797870 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod729edbbf_6306_4988_98f5_894516eee11f.slice/crio-bb205061faf7a57fb6ab8a13bcfe4447b53ad5bc7d7a5b7a2308eb6d7acff8e3 WatchSource:0}: Error finding container bb205061faf7a57fb6ab8a13bcfe4447b53ad5bc7d7a5b7a2308eb6d7acff8e3: Status 404 returned error can't find the container with id bb205061faf7a57fb6ab8a13bcfe4447b53ad5bc7d7a5b7a2308eb6d7acff8e3 Mar 14 09:39:51 crc kubenswrapper[4956]: I0314 09:39:51.035833 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm"] Mar 14 09:39:51 crc kubenswrapper[4956]: W0314 09:39:51.037254 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode175020b_d63c_4bdb_b605_78838ddb4f3f.slice/crio-681e9e114d2c2fe4288260c4cf86b7fb1799feddb3ae825fa9608db0b056a73b WatchSource:0}: Error finding container 681e9e114d2c2fe4288260c4cf86b7fb1799feddb3ae825fa9608db0b056a73b: Status 404 returned error can't find the container with id 
681e9e114d2c2fe4288260c4cf86b7fb1799feddb3ae825fa9608db0b056a73b Mar 14 09:39:51 crc kubenswrapper[4956]: I0314 09:39:51.220919 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0660ab33-2747-4326-8901-542c807ca75d" path="/var/lib/kubelet/pods/0660ab33-2747-4326-8901-542c807ca75d/volumes" Mar 14 09:39:51 crc kubenswrapper[4956]: I0314 09:39:51.398094 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm" event={"ID":"e175020b-d63c-4bdb-b605-78838ddb4f3f","Type":"ContainerStarted","Data":"f1e9183ff1f8807da241d845cb7718f58740fbdbfef022c5be5d726cad64f39d"} Mar 14 09:39:51 crc kubenswrapper[4956]: I0314 09:39:51.398163 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm" event={"ID":"e175020b-d63c-4bdb-b605-78838ddb4f3f","Type":"ContainerStarted","Data":"681e9e114d2c2fe4288260c4cf86b7fb1799feddb3ae825fa9608db0b056a73b"} Mar 14 09:39:51 crc kubenswrapper[4956]: I0314 09:39:51.400834 4956 generic.go:334] "Generic (PLEG): container finished" podID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerID="f4fe492e570dc32fd84f83fe630f15b22301e6f5e9664d7f8ba359f7b574a723" exitCode=2 Mar 14 09:39:51 crc kubenswrapper[4956]: I0314 09:39:51.400862 4956 generic.go:334] "Generic (PLEG): container finished" podID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerID="d794da0669cf87d346e1649003b7240d35ad4775076d8809790228f00cd1fcc6" exitCode=0 Mar 14 09:39:51 crc kubenswrapper[4956]: I0314 09:39:51.400871 4956 generic.go:334] "Generic (PLEG): container finished" podID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerID="8c96bea07202ab4f08d3726834a97a6d63b8507976c6e5fb757e142f3c7599dc" exitCode=0 Mar 14 09:39:51 crc kubenswrapper[4956]: I0314 09:39:51.400911 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"77f83d67-2555-4e0e-b8d9-f772b4ddea3a","Type":"ContainerDied","Data":"f4fe492e570dc32fd84f83fe630f15b22301e6f5e9664d7f8ba359f7b574a723"} Mar 14 09:39:51 crc kubenswrapper[4956]: I0314 09:39:51.400969 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"77f83d67-2555-4e0e-b8d9-f772b4ddea3a","Type":"ContainerDied","Data":"d794da0669cf87d346e1649003b7240d35ad4775076d8809790228f00cd1fcc6"} Mar 14 09:39:51 crc kubenswrapper[4956]: I0314 09:39:51.400986 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"77f83d67-2555-4e0e-b8d9-f772b4ddea3a","Type":"ContainerDied","Data":"8c96bea07202ab4f08d3726834a97a6d63b8507976c6e5fb757e142f3c7599dc"} Mar 14 09:39:51 crc kubenswrapper[4956]: I0314 09:39:51.403341 4956 generic.go:334] "Generic (PLEG): container finished" podID="729edbbf-6306-4988-98f5-894516eee11f" containerID="b41f3f37b6aee35504110174cec3dcd32758f8a3ee7135728a5f899caa3cf3a9" exitCode=0 Mar 14 09:39:51 crc kubenswrapper[4956]: I0314 09:39:51.403407 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-659h8" event={"ID":"729edbbf-6306-4988-98f5-894516eee11f","Type":"ContainerDied","Data":"b41f3f37b6aee35504110174cec3dcd32758f8a3ee7135728a5f899caa3cf3a9"} Mar 14 09:39:51 crc kubenswrapper[4956]: I0314 09:39:51.403450 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-659h8" event={"ID":"729edbbf-6306-4988-98f5-894516eee11f","Type":"ContainerStarted","Data":"bb205061faf7a57fb6ab8a13bcfe4447b53ad5bc7d7a5b7a2308eb6d7acff8e3"} Mar 14 09:39:51 crc kubenswrapper[4956]: I0314 09:39:51.429886 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm" podStartSLOduration=1.429862634 podStartE2EDuration="1.429862634s" podCreationTimestamp="2026-03-14 09:39:50 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:39:51.421433601 +0000 UTC m=+2596.934125879" watchObservedRunningTime="2026-03-14 09:39:51.429862634 +0000 UTC m=+2596.942554902" Mar 14 09:39:52 crc kubenswrapper[4956]: I0314 09:39:52.414772 4956 generic.go:334] "Generic (PLEG): container finished" podID="e175020b-d63c-4bdb-b605-78838ddb4f3f" containerID="f1e9183ff1f8807da241d845cb7718f58740fbdbfef022c5be5d726cad64f39d" exitCode=0 Mar 14 09:39:52 crc kubenswrapper[4956]: I0314 09:39:52.414881 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm" event={"ID":"e175020b-d63c-4bdb-b605-78838ddb4f3f","Type":"ContainerDied","Data":"f1e9183ff1f8807da241d845cb7718f58740fbdbfef022c5be5d726cad64f39d"} Mar 14 09:39:52 crc kubenswrapper[4956]: I0314 09:39:52.758979 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-659h8" Mar 14 09:39:52 crc kubenswrapper[4956]: I0314 09:39:52.873208 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prb8m\" (UniqueName: \"kubernetes.io/projected/729edbbf-6306-4988-98f5-894516eee11f-kube-api-access-prb8m\") pod \"729edbbf-6306-4988-98f5-894516eee11f\" (UID: \"729edbbf-6306-4988-98f5-894516eee11f\") " Mar 14 09:39:52 crc kubenswrapper[4956]: I0314 09:39:52.873606 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/729edbbf-6306-4988-98f5-894516eee11f-operator-scripts\") pod \"729edbbf-6306-4988-98f5-894516eee11f\" (UID: \"729edbbf-6306-4988-98f5-894516eee11f\") " Mar 14 09:39:52 crc kubenswrapper[4956]: I0314 09:39:52.874059 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/729edbbf-6306-4988-98f5-894516eee11f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "729edbbf-6306-4988-98f5-894516eee11f" (UID: "729edbbf-6306-4988-98f5-894516eee11f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:39:52 crc kubenswrapper[4956]: I0314 09:39:52.874471 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/729edbbf-6306-4988-98f5-894516eee11f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:52 crc kubenswrapper[4956]: I0314 09:39:52.878745 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/729edbbf-6306-4988-98f5-894516eee11f-kube-api-access-prb8m" (OuterVolumeSpecName: "kube-api-access-prb8m") pod "729edbbf-6306-4988-98f5-894516eee11f" (UID: "729edbbf-6306-4988-98f5-894516eee11f"). InnerVolumeSpecName "kube-api-access-prb8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:39:52 crc kubenswrapper[4956]: I0314 09:39:52.976043 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prb8m\" (UniqueName: \"kubernetes.io/projected/729edbbf-6306-4988-98f5-894516eee11f-kube-api-access-prb8m\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:53 crc kubenswrapper[4956]: I0314 09:39:53.422905 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-659h8" Mar 14 09:39:53 crc kubenswrapper[4956]: I0314 09:39:53.422901 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-659h8" event={"ID":"729edbbf-6306-4988-98f5-894516eee11f","Type":"ContainerDied","Data":"bb205061faf7a57fb6ab8a13bcfe4447b53ad5bc7d7a5b7a2308eb6d7acff8e3"} Mar 14 09:39:53 crc kubenswrapper[4956]: I0314 09:39:53.423710 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb205061faf7a57fb6ab8a13bcfe4447b53ad5bc7d7a5b7a2308eb6d7acff8e3" Mar 14 09:39:53 crc kubenswrapper[4956]: I0314 09:39:53.757230 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm" Mar 14 09:39:53 crc kubenswrapper[4956]: I0314 09:39:53.892821 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e175020b-d63c-4bdb-b605-78838ddb4f3f-operator-scripts\") pod \"e175020b-d63c-4bdb-b605-78838ddb4f3f\" (UID: \"e175020b-d63c-4bdb-b605-78838ddb4f3f\") " Mar 14 09:39:53 crc kubenswrapper[4956]: I0314 09:39:53.892990 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlfzx\" (UniqueName: \"kubernetes.io/projected/e175020b-d63c-4bdb-b605-78838ddb4f3f-kube-api-access-hlfzx\") pod \"e175020b-d63c-4bdb-b605-78838ddb4f3f\" (UID: \"e175020b-d63c-4bdb-b605-78838ddb4f3f\") " Mar 14 09:39:53 crc kubenswrapper[4956]: I0314 09:39:53.894234 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e175020b-d63c-4bdb-b605-78838ddb4f3f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e175020b-d63c-4bdb-b605-78838ddb4f3f" (UID: "e175020b-d63c-4bdb-b605-78838ddb4f3f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:39:53 crc kubenswrapper[4956]: I0314 09:39:53.896889 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e175020b-d63c-4bdb-b605-78838ddb4f3f-kube-api-access-hlfzx" (OuterVolumeSpecName: "kube-api-access-hlfzx") pod "e175020b-d63c-4bdb-b605-78838ddb4f3f" (UID: "e175020b-d63c-4bdb-b605-78838ddb4f3f"). InnerVolumeSpecName "kube-api-access-hlfzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:39:53 crc kubenswrapper[4956]: I0314 09:39:53.995345 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e175020b-d63c-4bdb-b605-78838ddb4f3f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:53 crc kubenswrapper[4956]: I0314 09:39:53.995374 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlfzx\" (UniqueName: \"kubernetes.io/projected/e175020b-d63c-4bdb-b605-78838ddb4f3f-kube-api-access-hlfzx\") on node \"crc\" DevicePath \"\"" Mar 14 09:39:54 crc kubenswrapper[4956]: I0314 09:39:54.433667 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm" event={"ID":"e175020b-d63c-4bdb-b605-78838ddb4f3f","Type":"ContainerDied","Data":"681e9e114d2c2fe4288260c4cf86b7fb1799feddb3ae825fa9608db0b056a73b"} Mar 14 09:39:54 crc kubenswrapper[4956]: I0314 09:39:54.433710 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm" Mar 14 09:39:54 crc kubenswrapper[4956]: I0314 09:39:54.433719 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="681e9e114d2c2fe4288260c4cf86b7fb1799feddb3ae825fa9608db0b056a73b" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.324904 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-scmdh"] Mar 14 09:39:55 crc kubenswrapper[4956]: E0314 09:39:55.325256 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e175020b-d63c-4bdb-b605-78838ddb4f3f" containerName="mariadb-account-create-update" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.325273 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e175020b-d63c-4bdb-b605-78838ddb4f3f" containerName="mariadb-account-create-update" Mar 14 09:39:55 crc kubenswrapper[4956]: E0314 09:39:55.325311 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="729edbbf-6306-4988-98f5-894516eee11f" containerName="mariadb-database-create" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.325318 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="729edbbf-6306-4988-98f5-894516eee11f" containerName="mariadb-database-create" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.325454 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="729edbbf-6306-4988-98f5-894516eee11f" containerName="mariadb-database-create" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.325472 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e175020b-d63c-4bdb-b605-78838ddb4f3f" containerName="mariadb-account-create-update" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.326079 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.328211 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.328511 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-6cfrf" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.340832 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-scmdh"] Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.518437 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-db-sync-config-data\") pod \"watcher-kuttl-db-sync-scmdh\" (UID: \"2c685dcf-5096-423c-96b3-fc765fefbe57\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.518506 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-config-data\") pod \"watcher-kuttl-db-sync-scmdh\" (UID: \"2c685dcf-5096-423c-96b3-fc765fefbe57\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.518529 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-scmdh\" (UID: \"2c685dcf-5096-423c-96b3-fc765fefbe57\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.518628 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vxx5\" (UniqueName: \"kubernetes.io/projected/2c685dcf-5096-423c-96b3-fc765fefbe57-kube-api-access-6vxx5\") pod \"watcher-kuttl-db-sync-scmdh\" (UID: \"2c685dcf-5096-423c-96b3-fc765fefbe57\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.620985 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vxx5\" (UniqueName: \"kubernetes.io/projected/2c685dcf-5096-423c-96b3-fc765fefbe57-kube-api-access-6vxx5\") pod \"watcher-kuttl-db-sync-scmdh\" (UID: \"2c685dcf-5096-423c-96b3-fc765fefbe57\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.621154 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-db-sync-config-data\") pod \"watcher-kuttl-db-sync-scmdh\" (UID: \"2c685dcf-5096-423c-96b3-fc765fefbe57\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.621195 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-config-data\") pod \"watcher-kuttl-db-sync-scmdh\" (UID: \"2c685dcf-5096-423c-96b3-fc765fefbe57\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.621228 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-scmdh\" (UID: \"2c685dcf-5096-423c-96b3-fc765fefbe57\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 
09:39:55.627047 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-scmdh\" (UID: \"2c685dcf-5096-423c-96b3-fc765fefbe57\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.627343 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-config-data\") pod \"watcher-kuttl-db-sync-scmdh\" (UID: \"2c685dcf-5096-423c-96b3-fc765fefbe57\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.630890 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-db-sync-config-data\") pod \"watcher-kuttl-db-sync-scmdh\" (UID: \"2c685dcf-5096-423c-96b3-fc765fefbe57\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.645895 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vxx5\" (UniqueName: \"kubernetes.io/projected/2c685dcf-5096-423c-96b3-fc765fefbe57-kube-api-access-6vxx5\") pod \"watcher-kuttl-db-sync-scmdh\" (UID: \"2c685dcf-5096-423c-96b3-fc765fefbe57\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" Mar 14 09:39:55 crc kubenswrapper[4956]: I0314 09:39:55.941704 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" Mar 14 09:39:56 crc kubenswrapper[4956]: I0314 09:39:56.078021 4956 scope.go:117] "RemoveContainer" containerID="ae57f6725505eacf9f5408536b02d8b676bdd128d6b309b38b91fbd3642252be" Mar 14 09:39:56 crc kubenswrapper[4956]: I0314 09:39:56.108514 4956 scope.go:117] "RemoveContainer" containerID="078d669d2df4819508d75c4da8d573b742cbd2f99baf478951163df58e6899ae" Mar 14 09:39:56 crc kubenswrapper[4956]: I0314 09:39:56.148189 4956 scope.go:117] "RemoveContainer" containerID="f755e9990317c101293bd7339f4aa8537a85570cc4ad08d2b73a66a238165aad" Mar 14 09:39:56 crc kubenswrapper[4956]: I0314 09:39:56.436439 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-scmdh"] Mar 14 09:39:56 crc kubenswrapper[4956]: I0314 09:39:56.452640 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" event={"ID":"2c685dcf-5096-423c-96b3-fc765fefbe57","Type":"ContainerStarted","Data":"cc256e36caaad8edcbdc7b5df0bbcbe70979181e374f00e0e07520afe9e66ab7"} Mar 14 09:39:57 crc kubenswrapper[4956]: I0314 09:39:57.471165 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" event={"ID":"2c685dcf-5096-423c-96b3-fc765fefbe57","Type":"ContainerStarted","Data":"64b35f687efc8d5a793a1cb1bf645014be3d132c644f91aa6e9356e8c8b6ad25"} Mar 14 09:39:57 crc kubenswrapper[4956]: I0314 09:39:57.494701 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" podStartSLOduration=2.494676311 podStartE2EDuration="2.494676311s" podCreationTimestamp="2026-03-14 09:39:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:39:57.484822871 +0000 UTC m=+2602.997515169" watchObservedRunningTime="2026-03-14 09:39:57.494676311 
+0000 UTC m=+2603.007368579" Mar 14 09:39:59 crc kubenswrapper[4956]: I0314 09:39:59.490960 4956 generic.go:334] "Generic (PLEG): container finished" podID="2c685dcf-5096-423c-96b3-fc765fefbe57" containerID="64b35f687efc8d5a793a1cb1bf645014be3d132c644f91aa6e9356e8c8b6ad25" exitCode=0 Mar 14 09:39:59 crc kubenswrapper[4956]: I0314 09:39:59.491239 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" event={"ID":"2c685dcf-5096-423c-96b3-fc765fefbe57","Type":"ContainerDied","Data":"64b35f687efc8d5a793a1cb1bf645014be3d132c644f91aa6e9356e8c8b6ad25"} Mar 14 09:40:00 crc kubenswrapper[4956]: I0314 09:40:00.130997 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558020-jbvd9"] Mar 14 09:40:00 crc kubenswrapper[4956]: I0314 09:40:00.132333 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558020-jbvd9" Mar 14 09:40:00 crc kubenswrapper[4956]: I0314 09:40:00.139136 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:40:00 crc kubenswrapper[4956]: I0314 09:40:00.139203 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:40:00 crc kubenswrapper[4956]: I0314 09:40:00.139217 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:40:00 crc kubenswrapper[4956]: I0314 09:40:00.142634 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558020-jbvd9"] Mar 14 09:40:00 crc kubenswrapper[4956]: I0314 09:40:00.222424 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flt45\" (UniqueName: \"kubernetes.io/projected/5002f91d-89aa-4683-bd3f-118a91b6459c-kube-api-access-flt45\") pod 
\"auto-csr-approver-29558020-jbvd9\" (UID: \"5002f91d-89aa-4683-bd3f-118a91b6459c\") " pod="openshift-infra/auto-csr-approver-29558020-jbvd9" Mar 14 09:40:00 crc kubenswrapper[4956]: I0314 09:40:00.324715 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flt45\" (UniqueName: \"kubernetes.io/projected/5002f91d-89aa-4683-bd3f-118a91b6459c-kube-api-access-flt45\") pod \"auto-csr-approver-29558020-jbvd9\" (UID: \"5002f91d-89aa-4683-bd3f-118a91b6459c\") " pod="openshift-infra/auto-csr-approver-29558020-jbvd9" Mar 14 09:40:00 crc kubenswrapper[4956]: I0314 09:40:00.350557 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flt45\" (UniqueName: \"kubernetes.io/projected/5002f91d-89aa-4683-bd3f-118a91b6459c-kube-api-access-flt45\") pod \"auto-csr-approver-29558020-jbvd9\" (UID: \"5002f91d-89aa-4683-bd3f-118a91b6459c\") " pod="openshift-infra/auto-csr-approver-29558020-jbvd9" Mar 14 09:40:00 crc kubenswrapper[4956]: I0314 09:40:00.456885 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558020-jbvd9" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.022594 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.137430 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558020-jbvd9"] Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.137970 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vxx5\" (UniqueName: \"kubernetes.io/projected/2c685dcf-5096-423c-96b3-fc765fefbe57-kube-api-access-6vxx5\") pod \"2c685dcf-5096-423c-96b3-fc765fefbe57\" (UID: \"2c685dcf-5096-423c-96b3-fc765fefbe57\") " Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.138098 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-db-sync-config-data\") pod \"2c685dcf-5096-423c-96b3-fc765fefbe57\" (UID: \"2c685dcf-5096-423c-96b3-fc765fefbe57\") " Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.138132 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-config-data\") pod \"2c685dcf-5096-423c-96b3-fc765fefbe57\" (UID: \"2c685dcf-5096-423c-96b3-fc765fefbe57\") " Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.138167 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-combined-ca-bundle\") pod \"2c685dcf-5096-423c-96b3-fc765fefbe57\" (UID: \"2c685dcf-5096-423c-96b3-fc765fefbe57\") " Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.143627 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2c685dcf-5096-423c-96b3-fc765fefbe57" 
(UID: "2c685dcf-5096-423c-96b3-fc765fefbe57"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.146735 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c685dcf-5096-423c-96b3-fc765fefbe57-kube-api-access-6vxx5" (OuterVolumeSpecName: "kube-api-access-6vxx5") pod "2c685dcf-5096-423c-96b3-fc765fefbe57" (UID: "2c685dcf-5096-423c-96b3-fc765fefbe57"). InnerVolumeSpecName "kube-api-access-6vxx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.163630 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c685dcf-5096-423c-96b3-fc765fefbe57" (UID: "2c685dcf-5096-423c-96b3-fc765fefbe57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.183361 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-config-data" (OuterVolumeSpecName: "config-data") pod "2c685dcf-5096-423c-96b3-fc765fefbe57" (UID: "2c685dcf-5096-423c-96b3-fc765fefbe57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.244230 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vxx5\" (UniqueName: \"kubernetes.io/projected/2c685dcf-5096-423c-96b3-fc765fefbe57-kube-api-access-6vxx5\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.244276 4956 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.244290 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.244303 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c685dcf-5096-423c-96b3-fc765fefbe57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.528244 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" event={"ID":"2c685dcf-5096-423c-96b3-fc765fefbe57","Type":"ContainerDied","Data":"cc256e36caaad8edcbdc7b5df0bbcbe70979181e374f00e0e07520afe9e66ab7"} Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.528289 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc256e36caaad8edcbdc7b5df0bbcbe70979181e374f00e0e07520afe9e66ab7" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.528604 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-scmdh" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.529299 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558020-jbvd9" event={"ID":"5002f91d-89aa-4683-bd3f-118a91b6459c","Type":"ContainerStarted","Data":"4de34e77af92824934b4a5ac6bfe5b395d2d16ec945eafe8f80bed8a5750e52e"} Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.766159 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:40:01 crc kubenswrapper[4956]: E0314 09:40:01.766631 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c685dcf-5096-423c-96b3-fc765fefbe57" containerName="watcher-kuttl-db-sync" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.766654 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c685dcf-5096-423c-96b3-fc765fefbe57" containerName="watcher-kuttl-db-sync" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.766896 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c685dcf-5096-423c-96b3-fc765fefbe57" containerName="watcher-kuttl-db-sync" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.768144 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.771390 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.771475 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-6cfrf" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.781364 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.853741 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.853983 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpw47\" (UniqueName: \"kubernetes.io/projected/6035407b-e061-4ae2-998c-cd55b06f781f-kube-api-access-bpw47\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.854009 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.854030 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6035407b-e061-4ae2-998c-cd55b06f781f-logs\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.854059 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.854148 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.855886 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.857568 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.861125 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.889319 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.890428 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.892137 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.897060 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.914451 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.956266 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.956338 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpw47\" (UniqueName: \"kubernetes.io/projected/6035407b-e061-4ae2-998c-cd55b06f781f-kube-api-access-bpw47\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.956376 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.956407 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6035407b-e061-4ae2-998c-cd55b06f781f-logs\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.956441 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3156e33d-b397-4cc7-a8e1-1aa640478902-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.956499 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.956527 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.956567 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.957192 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0100c485-2f3c-4cbf-ac49-566c008facb5-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.957228 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd4zs\" (UniqueName: \"kubernetes.io/projected/0100c485-2f3c-4cbf-ac49-566c008facb5-kube-api-access-cd4zs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.957333 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.957400 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.957411 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6035407b-e061-4ae2-998c-cd55b06f781f-logs\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.957621 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.957659 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.958639 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpgrp\" (UniqueName: \"kubernetes.io/projected/3156e33d-b397-4cc7-a8e1-1aa640478902-kube-api-access-dpgrp\") pod \"watcher-kuttl-applier-0\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.958698 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.958721 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.960691 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.960764 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.971001 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.971104 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:01 crc kubenswrapper[4956]: I0314 09:40:01.973198 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpw47\" (UniqueName: \"kubernetes.io/projected/6035407b-e061-4ae2-998c-cd55b06f781f-kube-api-access-bpw47\") pod \"watcher-kuttl-api-0\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:02 crc 
kubenswrapper[4956]: I0314 09:40:02.061458 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.061560 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.061582 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.061613 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpgrp\" (UniqueName: \"kubernetes.io/projected/3156e33d-b397-4cc7-a8e1-1aa640478902-kube-api-access-dpgrp\") pod \"watcher-kuttl-applier-0\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.061654 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.061706 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3156e33d-b397-4cc7-a8e1-1aa640478902-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.061750 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.061797 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.061816 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0100c485-2f3c-4cbf-ac49-566c008facb5-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.061835 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd4zs\" (UniqueName: \"kubernetes.io/projected/0100c485-2f3c-4cbf-ac49-566c008facb5-kube-api-access-cd4zs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.061850 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.062878 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3156e33d-b397-4cc7-a8e1-1aa640478902-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.065372 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.065556 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.065885 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.066030 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0100c485-2f3c-4cbf-ac49-566c008facb5-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.067111 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.067779 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.068914 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.069047 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 
09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.087367 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpgrp\" (UniqueName: \"kubernetes.io/projected/3156e33d-b397-4cc7-a8e1-1aa640478902-kube-api-access-dpgrp\") pod \"watcher-kuttl-applier-0\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.092849 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.094668 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd4zs\" (UniqueName: \"kubernetes.io/projected/0100c485-2f3c-4cbf-ac49-566c008facb5-kube-api-access-cd4zs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.176899 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.209112 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.565276 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.680086 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:40:02 crc kubenswrapper[4956]: W0314 09:40:02.687494 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3156e33d_b397_4cc7_a8e1_1aa640478902.slice/crio-8fed6d9e5bb5f0b6e6674a2fcf2ab9ab0ca319f50e9475ca45b1930671942589 WatchSource:0}: Error finding container 8fed6d9e5bb5f0b6e6674a2fcf2ab9ab0ca319f50e9475ca45b1930671942589: Status 404 returned error can't find the container with id 8fed6d9e5bb5f0b6e6674a2fcf2ab9ab0ca319f50e9475ca45b1930671942589 Mar 14 09:40:02 crc kubenswrapper[4956]: W0314 09:40:02.790102 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0100c485_2f3c_4cbf_ac49_566c008facb5.slice/crio-24af4f10a901927dcee441232a1bf07970c2def6608a41b9e70db10dce1b4557 WatchSource:0}: Error finding container 24af4f10a901927dcee441232a1bf07970c2def6608a41b9e70db10dce1b4557: Status 404 returned error can't find the container with id 24af4f10a901927dcee441232a1bf07970c2def6608a41b9e70db10dce1b4557 Mar 14 09:40:02 crc kubenswrapper[4956]: I0314 09:40:02.793729 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:40:03 crc kubenswrapper[4956]: I0314 09:40:03.548025 4956 generic.go:334] "Generic (PLEG): container finished" podID="5002f91d-89aa-4683-bd3f-118a91b6459c" containerID="7b6468e56288278161e95b0db81979778da89c96a845c69ceac10eb420ef803f" exitCode=0 Mar 14 09:40:03 crc kubenswrapper[4956]: I0314 
09:40:03.548085 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558020-jbvd9" event={"ID":"5002f91d-89aa-4683-bd3f-118a91b6459c","Type":"ContainerDied","Data":"7b6468e56288278161e95b0db81979778da89c96a845c69ceac10eb420ef803f"} Mar 14 09:40:03 crc kubenswrapper[4956]: I0314 09:40:03.550686 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0100c485-2f3c-4cbf-ac49-566c008facb5","Type":"ContainerStarted","Data":"3eb4fc8b9002dab5466ead4ab21e8a1e37b3ac2d7c657c650ec59aac6b8212f3"} Mar 14 09:40:03 crc kubenswrapper[4956]: I0314 09:40:03.550724 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0100c485-2f3c-4cbf-ac49-566c008facb5","Type":"ContainerStarted","Data":"24af4f10a901927dcee441232a1bf07970c2def6608a41b9e70db10dce1b4557"} Mar 14 09:40:03 crc kubenswrapper[4956]: I0314 09:40:03.553084 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"3156e33d-b397-4cc7-a8e1-1aa640478902","Type":"ContainerStarted","Data":"c0bc917189e842e65890f24b51c698db38d33b7cfa555b0e10bb5cf71508a336"} Mar 14 09:40:03 crc kubenswrapper[4956]: I0314 09:40:03.553116 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"3156e33d-b397-4cc7-a8e1-1aa640478902","Type":"ContainerStarted","Data":"8fed6d9e5bb5f0b6e6674a2fcf2ab9ab0ca319f50e9475ca45b1930671942589"} Mar 14 09:40:03 crc kubenswrapper[4956]: I0314 09:40:03.554821 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6035407b-e061-4ae2-998c-cd55b06f781f","Type":"ContainerStarted","Data":"4dddd0a791ab0c2de072120a147e86d4ae6bf8e03dbc69ad2044ac270f09d233"} Mar 14 09:40:03 crc kubenswrapper[4956]: I0314 09:40:03.554851 4956 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6035407b-e061-4ae2-998c-cd55b06f781f","Type":"ContainerStarted","Data":"e5d94ec0f23a64e98f0675f322403c9554aa649c8aeca67814d5a9fa57b4b7ac"} Mar 14 09:40:03 crc kubenswrapper[4956]: I0314 09:40:03.554888 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6035407b-e061-4ae2-998c-cd55b06f781f","Type":"ContainerStarted","Data":"266c2e7b94cb31597a278958ed11a83e4e662eee7bf126923a32544f534a8ffb"} Mar 14 09:40:03 crc kubenswrapper[4956]: I0314 09:40:03.557328 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:03 crc kubenswrapper[4956]: I0314 09:40:03.703355 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.703327157 podStartE2EDuration="2.703327157s" podCreationTimestamp="2026-03-14 09:40:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:40:03.636882405 +0000 UTC m=+2609.149574673" watchObservedRunningTime="2026-03-14 09:40:03.703327157 +0000 UTC m=+2609.216019425" Mar 14 09:40:03 crc kubenswrapper[4956]: I0314 09:40:03.708245 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.708231621 podStartE2EDuration="2.708231621s" podCreationTimestamp="2026-03-14 09:40:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:40:03.70580584 +0000 UTC m=+2609.218498098" watchObservedRunningTime="2026-03-14 09:40:03.708231621 +0000 UTC m=+2609.220923889" Mar 14 09:40:03 crc kubenswrapper[4956]: I0314 09:40:03.748676 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.748657014 podStartE2EDuration="2.748657014s" podCreationTimestamp="2026-03-14 09:40:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:40:03.742865838 +0000 UTC m=+2609.255558106" watchObservedRunningTime="2026-03-14 09:40:03.748657014 +0000 UTC m=+2609.261349282" Mar 14 09:40:05 crc kubenswrapper[4956]: I0314 09:40:05.000343 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558020-jbvd9" Mar 14 09:40:05 crc kubenswrapper[4956]: I0314 09:40:05.036693 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flt45\" (UniqueName: \"kubernetes.io/projected/5002f91d-89aa-4683-bd3f-118a91b6459c-kube-api-access-flt45\") pod \"5002f91d-89aa-4683-bd3f-118a91b6459c\" (UID: \"5002f91d-89aa-4683-bd3f-118a91b6459c\") " Mar 14 09:40:05 crc kubenswrapper[4956]: I0314 09:40:05.058278 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5002f91d-89aa-4683-bd3f-118a91b6459c-kube-api-access-flt45" (OuterVolumeSpecName: "kube-api-access-flt45") pod "5002f91d-89aa-4683-bd3f-118a91b6459c" (UID: "5002f91d-89aa-4683-bd3f-118a91b6459c"). InnerVolumeSpecName "kube-api-access-flt45". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:40:05 crc kubenswrapper[4956]: I0314 09:40:05.139334 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flt45\" (UniqueName: \"kubernetes.io/projected/5002f91d-89aa-4683-bd3f-118a91b6459c-kube-api-access-flt45\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:05 crc kubenswrapper[4956]: I0314 09:40:05.578596 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558020-jbvd9" Mar 14 09:40:05 crc kubenswrapper[4956]: I0314 09:40:05.578633 4956 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 09:40:05 crc kubenswrapper[4956]: I0314 09:40:05.578670 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558020-jbvd9" event={"ID":"5002f91d-89aa-4683-bd3f-118a91b6459c","Type":"ContainerDied","Data":"4de34e77af92824934b4a5ac6bfe5b395d2d16ec945eafe8f80bed8a5750e52e"} Mar 14 09:40:05 crc kubenswrapper[4956]: I0314 09:40:05.578727 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4de34e77af92824934b4a5ac6bfe5b395d2d16ec945eafe8f80bed8a5750e52e" Mar 14 09:40:06 crc kubenswrapper[4956]: I0314 09:40:06.038450 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:06 crc kubenswrapper[4956]: I0314 09:40:06.056347 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558014-bkvpt"] Mar 14 09:40:06 crc kubenswrapper[4956]: I0314 09:40:06.062240 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558014-bkvpt"] Mar 14 09:40:07 crc kubenswrapper[4956]: I0314 09:40:07.093603 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:07 crc kubenswrapper[4956]: I0314 09:40:07.177972 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:07 crc kubenswrapper[4956]: I0314 09:40:07.219680 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8250654a-de53-49bc-91c8-bda36567251c" path="/var/lib/kubelet/pods/8250654a-de53-49bc-91c8-bda36567251c/volumes" Mar 14 09:40:12 crc kubenswrapper[4956]: I0314 09:40:12.094079 4956 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:12 crc kubenswrapper[4956]: I0314 09:40:12.099891 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:12 crc kubenswrapper[4956]: I0314 09:40:12.177498 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:12 crc kubenswrapper[4956]: I0314 09:40:12.209828 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:12 crc kubenswrapper[4956]: I0314 09:40:12.210333 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:12 crc kubenswrapper[4956]: I0314 09:40:12.250477 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:12 crc kubenswrapper[4956]: I0314 09:40:12.640160 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:12 crc kubenswrapper[4956]: I0314 09:40:12.644226 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:12 crc kubenswrapper[4956]: I0314 09:40:12.666762 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:12 crc kubenswrapper[4956]: I0314 09:40:12.666849 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:15 crc kubenswrapper[4956]: I0314 09:40:15.968814 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" 
podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 14 09:40:20 crc kubenswrapper[4956]: I0314 09:40:20.713681 4956 generic.go:334] "Generic (PLEG): container finished" podID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerID="022deb1830c2782aa3395483d3c83d3db219bff216e8bb3472fcf7797da8c1a1" exitCode=137 Mar 14 09:40:20 crc kubenswrapper[4956]: I0314 09:40:20.713749 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"77f83d67-2555-4e0e-b8d9-f772b4ddea3a","Type":"ContainerDied","Data":"022deb1830c2782aa3395483d3c83d3db219bff216e8bb3472fcf7797da8c1a1"} Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.375663 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.422587 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-run-httpd\") pod \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.422634 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-sg-core-conf-yaml\") pod \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.422665 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-scripts\") pod \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.422726 
4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-log-httpd\") pod \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.422760 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwx4m\" (UniqueName: \"kubernetes.io/projected/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-kube-api-access-qwx4m\") pod \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.422853 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-config-data\") pod \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.422933 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-ceilometer-tls-certs\") pod \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.423021 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-combined-ca-bundle\") pod \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\" (UID: \"77f83d67-2555-4e0e-b8d9-f772b4ddea3a\") " Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.423285 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod 
"77f83d67-2555-4e0e-b8d9-f772b4ddea3a" (UID: "77f83d67-2555-4e0e-b8d9-f772b4ddea3a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.423448 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.423656 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "77f83d67-2555-4e0e-b8d9-f772b4ddea3a" (UID: "77f83d67-2555-4e0e-b8d9-f772b4ddea3a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.431970 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-scripts" (OuterVolumeSpecName: "scripts") pod "77f83d67-2555-4e0e-b8d9-f772b4ddea3a" (UID: "77f83d67-2555-4e0e-b8d9-f772b4ddea3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.433720 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-kube-api-access-qwx4m" (OuterVolumeSpecName: "kube-api-access-qwx4m") pod "77f83d67-2555-4e0e-b8d9-f772b4ddea3a" (UID: "77f83d67-2555-4e0e-b8d9-f772b4ddea3a"). InnerVolumeSpecName "kube-api-access-qwx4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.450868 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "77f83d67-2555-4e0e-b8d9-f772b4ddea3a" (UID: "77f83d67-2555-4e0e-b8d9-f772b4ddea3a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.467360 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "77f83d67-2555-4e0e-b8d9-f772b4ddea3a" (UID: "77f83d67-2555-4e0e-b8d9-f772b4ddea3a"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.488041 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77f83d67-2555-4e0e-b8d9-f772b4ddea3a" (UID: "77f83d67-2555-4e0e-b8d9-f772b4ddea3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.511268 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-config-data" (OuterVolumeSpecName: "config-data") pod "77f83d67-2555-4e0e-b8d9-f772b4ddea3a" (UID: "77f83d67-2555-4e0e-b8d9-f772b4ddea3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.526971 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.527011 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.527021 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.527032 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.527042 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.527051 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwx4m\" (UniqueName: \"kubernetes.io/projected/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-kube-api-access-qwx4m\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.527060 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77f83d67-2555-4e0e-b8d9-f772b4ddea3a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.729395 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"77f83d67-2555-4e0e-b8d9-f772b4ddea3a","Type":"ContainerDied","Data":"0f9566fd0a0309d0b49aca1208bbc7615c39967c074f2de6bf643807ead2c4c2"} Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.729543 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.729749 4956 scope.go:117] "RemoveContainer" containerID="022deb1830c2782aa3395483d3c83d3db219bff216e8bb3472fcf7797da8c1a1" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.753064 4956 scope.go:117] "RemoveContainer" containerID="f4fe492e570dc32fd84f83fe630f15b22301e6f5e9664d7f8ba359f7b574a723" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.771498 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.784274 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.786218 4956 scope.go:117] "RemoveContainer" containerID="d794da0669cf87d346e1649003b7240d35ad4775076d8809790228f00cd1fcc6" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.805026 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:40:21 crc kubenswrapper[4956]: E0314 09:40:21.805443 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerName="proxy-httpd" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.805462 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerName="proxy-httpd" Mar 14 09:40:21 crc kubenswrapper[4956]: E0314 09:40:21.805499 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5002f91d-89aa-4683-bd3f-118a91b6459c" containerName="oc" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.805505 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5002f91d-89aa-4683-bd3f-118a91b6459c" containerName="oc" Mar 14 09:40:21 crc kubenswrapper[4956]: E0314 09:40:21.805529 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerName="ceilometer-notification-agent" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.805535 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerName="ceilometer-notification-agent" Mar 14 09:40:21 crc kubenswrapper[4956]: E0314 09:40:21.805550 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerName="ceilometer-central-agent" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.805556 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerName="ceilometer-central-agent" Mar 14 09:40:21 crc kubenswrapper[4956]: E0314 09:40:21.805871 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerName="sg-core" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.805881 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerName="sg-core" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.806039 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerName="ceilometer-notification-agent" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.806051 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerName="proxy-httpd" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.806064 4956 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerName="sg-core" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.806072 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5002f91d-89aa-4683-bd3f-118a91b6459c" containerName="oc" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.806082 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" containerName="ceilometer-central-agent" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.807774 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.812095 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.812804 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.813076 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.822528 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.826198 4956 scope.go:117] "RemoveContainer" containerID="8c96bea07202ab4f08d3726834a97a6d63b8507976c6e5fb757e142f3c7599dc" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.933819 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-config-data\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.933874 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e49ee504-2434-4b11-8b9c-baf52f4654df-log-httpd\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.933894 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.933917 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-scripts\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.934055 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79kkb\" (UniqueName: \"kubernetes.io/projected/e49ee504-2434-4b11-8b9c-baf52f4654df-kube-api-access-79kkb\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.934225 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.934284 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:21 crc kubenswrapper[4956]: I0314 09:40:21.934611 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e49ee504-2434-4b11-8b9c-baf52f4654df-run-httpd\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.036308 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-config-data\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.036361 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e49ee504-2434-4b11-8b9c-baf52f4654df-log-httpd\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.036382 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.036399 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-scripts\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.036431 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79kkb\" (UniqueName: \"kubernetes.io/projected/e49ee504-2434-4b11-8b9c-baf52f4654df-kube-api-access-79kkb\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.036464 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.036498 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.036914 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e49ee504-2434-4b11-8b9c-baf52f4654df-run-httpd\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.037136 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e49ee504-2434-4b11-8b9c-baf52f4654df-log-httpd\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " 
pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.037493 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e49ee504-2434-4b11-8b9c-baf52f4654df-run-httpd\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.041110 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.041242 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-scripts\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.041975 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.047642 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.054388 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-config-data\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.056302 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79kkb\" (UniqueName: \"kubernetes.io/projected/e49ee504-2434-4b11-8b9c-baf52f4654df-kube-api-access-79kkb\") pod \"ceilometer-0\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.128393 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.549679 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:40:22 crc kubenswrapper[4956]: W0314 09:40:22.554710 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode49ee504_2434_4b11_8b9c_baf52f4654df.slice/crio-66aee52defedfe85545f527a965c156381322c2a6d8535d97138e88b719cc14d WatchSource:0}: Error finding container 66aee52defedfe85545f527a965c156381322c2a6d8535d97138e88b719cc14d: Status 404 returned error can't find the container with id 66aee52defedfe85545f527a965c156381322c2a6d8535d97138e88b719cc14d Mar 14 09:40:22 crc kubenswrapper[4956]: I0314 09:40:22.740153 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e49ee504-2434-4b11-8b9c-baf52f4654df","Type":"ContainerStarted","Data":"66aee52defedfe85545f527a965c156381322c2a6d8535d97138e88b719cc14d"} Mar 14 09:40:23 crc kubenswrapper[4956]: I0314 09:40:23.219760 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77f83d67-2555-4e0e-b8d9-f772b4ddea3a" 
path="/var/lib/kubelet/pods/77f83d67-2555-4e0e-b8d9-f772b4ddea3a/volumes" Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.405169 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-scmdh"] Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.420176 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-scmdh"] Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.462996 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcherdb6f-account-delete-sl6nq"] Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.464108 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcherdb6f-account-delete-sl6nq" Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.479794 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcherdb6f-account-delete-sl6nq"] Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.490305 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.490642 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="0100c485-2f3c-4cbf-ac49-566c008facb5" containerName="watcher-decision-engine" containerID="cri-o://3eb4fc8b9002dab5466ead4ab21e8a1e37b3ac2d7c657c650ec59aac6b8212f3" gracePeriod=30 Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.534124 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.534476 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="6035407b-e061-4ae2-998c-cd55b06f781f" containerName="watcher-kuttl-api-log" 
containerID="cri-o://e5d94ec0f23a64e98f0675f322403c9554aa649c8aeca67814d5a9fa57b4b7ac" gracePeriod=30 Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.534674 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="6035407b-e061-4ae2-998c-cd55b06f781f" containerName="watcher-api" containerID="cri-o://4dddd0a791ab0c2de072120a147e86d4ae6bf8e03dbc69ad2044ac270f09d233" gracePeriod=30 Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.586338 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/995eaef5-e453-402f-8c9a-36653d25e6d3-operator-scripts\") pod \"watcherdb6f-account-delete-sl6nq\" (UID: \"995eaef5-e453-402f-8c9a-36653d25e6d3\") " pod="watcher-kuttl-default/watcherdb6f-account-delete-sl6nq" Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.586439 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcw8s\" (UniqueName: \"kubernetes.io/projected/995eaef5-e453-402f-8c9a-36653d25e6d3-kube-api-access-wcw8s\") pod \"watcherdb6f-account-delete-sl6nq\" (UID: \"995eaef5-e453-402f-8c9a-36653d25e6d3\") " pod="watcher-kuttl-default/watcherdb6f-account-delete-sl6nq" Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.594742 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.595022 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="3156e33d-b397-4cc7-a8e1-1aa640478902" containerName="watcher-applier" containerID="cri-o://c0bc917189e842e65890f24b51c698db38d33b7cfa555b0e10bb5cf71508a336" gracePeriod=30 Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.688573 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/995eaef5-e453-402f-8c9a-36653d25e6d3-operator-scripts\") pod \"watcherdb6f-account-delete-sl6nq\" (UID: \"995eaef5-e453-402f-8c9a-36653d25e6d3\") " pod="watcher-kuttl-default/watcherdb6f-account-delete-sl6nq" Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.688634 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcw8s\" (UniqueName: \"kubernetes.io/projected/995eaef5-e453-402f-8c9a-36653d25e6d3-kube-api-access-wcw8s\") pod \"watcherdb6f-account-delete-sl6nq\" (UID: \"995eaef5-e453-402f-8c9a-36653d25e6d3\") " pod="watcher-kuttl-default/watcherdb6f-account-delete-sl6nq" Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.689658 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/995eaef5-e453-402f-8c9a-36653d25e6d3-operator-scripts\") pod \"watcherdb6f-account-delete-sl6nq\" (UID: \"995eaef5-e453-402f-8c9a-36653d25e6d3\") " pod="watcher-kuttl-default/watcherdb6f-account-delete-sl6nq" Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.724203 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcw8s\" (UniqueName: \"kubernetes.io/projected/995eaef5-e453-402f-8c9a-36653d25e6d3-kube-api-access-wcw8s\") pod \"watcherdb6f-account-delete-sl6nq\" (UID: \"995eaef5-e453-402f-8c9a-36653d25e6d3\") " pod="watcher-kuttl-default/watcherdb6f-account-delete-sl6nq" Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.762825 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e49ee504-2434-4b11-8b9c-baf52f4654df","Type":"ContainerStarted","Data":"949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3"} Mar 14 09:40:24 crc kubenswrapper[4956]: I0314 09:40:24.800454 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcherdb6f-account-delete-sl6nq" Mar 14 09:40:25 crc kubenswrapper[4956]: I0314 09:40:25.228610 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c685dcf-5096-423c-96b3-fc765fefbe57" path="/var/lib/kubelet/pods/2c685dcf-5096-423c-96b3-fc765fefbe57/volumes" Mar 14 09:40:25 crc kubenswrapper[4956]: W0314 09:40:25.262561 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod995eaef5_e453_402f_8c9a_36653d25e6d3.slice/crio-a012a0aec688710d93ca3274b9d45af541ff1c23ad60f0866c2ca94208b22e5a WatchSource:0}: Error finding container a012a0aec688710d93ca3274b9d45af541ff1c23ad60f0866c2ca94208b22e5a: Status 404 returned error can't find the container with id a012a0aec688710d93ca3274b9d45af541ff1c23ad60f0866c2ca94208b22e5a Mar 14 09:40:25 crc kubenswrapper[4956]: I0314 09:40:25.263390 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcherdb6f-account-delete-sl6nq"] Mar 14 09:40:25 crc kubenswrapper[4956]: I0314 09:40:25.777163 4956 generic.go:334] "Generic (PLEG): container finished" podID="6035407b-e061-4ae2-998c-cd55b06f781f" containerID="4dddd0a791ab0c2de072120a147e86d4ae6bf8e03dbc69ad2044ac270f09d233" exitCode=0 Mar 14 09:40:25 crc kubenswrapper[4956]: I0314 09:40:25.777506 4956 generic.go:334] "Generic (PLEG): container finished" podID="6035407b-e061-4ae2-998c-cd55b06f781f" containerID="e5d94ec0f23a64e98f0675f322403c9554aa649c8aeca67814d5a9fa57b4b7ac" exitCode=143 Mar 14 09:40:25 crc kubenswrapper[4956]: I0314 09:40:25.777557 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6035407b-e061-4ae2-998c-cd55b06f781f","Type":"ContainerDied","Data":"4dddd0a791ab0c2de072120a147e86d4ae6bf8e03dbc69ad2044ac270f09d233"} Mar 14 09:40:25 crc kubenswrapper[4956]: I0314 09:40:25.777591 4956 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6035407b-e061-4ae2-998c-cd55b06f781f","Type":"ContainerDied","Data":"e5d94ec0f23a64e98f0675f322403c9554aa649c8aeca67814d5a9fa57b4b7ac"} Mar 14 09:40:25 crc kubenswrapper[4956]: I0314 09:40:25.790402 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcherdb6f-account-delete-sl6nq" event={"ID":"995eaef5-e453-402f-8c9a-36653d25e6d3","Type":"ContainerStarted","Data":"a1623505805caebc13b8d8fcee88b6e3251419e9e8f0f6b76d5704fab9553286"} Mar 14 09:40:25 crc kubenswrapper[4956]: I0314 09:40:25.790456 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcherdb6f-account-delete-sl6nq" event={"ID":"995eaef5-e453-402f-8c9a-36653d25e6d3","Type":"ContainerStarted","Data":"a012a0aec688710d93ca3274b9d45af541ff1c23ad60f0866c2ca94208b22e5a"} Mar 14 09:40:25 crc kubenswrapper[4956]: I0314 09:40:25.794289 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e49ee504-2434-4b11-8b9c-baf52f4654df","Type":"ContainerStarted","Data":"b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd"} Mar 14 09:40:25 crc kubenswrapper[4956]: I0314 09:40:25.809875 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcherdb6f-account-delete-sl6nq" podStartSLOduration=1.809849062 podStartE2EDuration="1.809849062s" podCreationTimestamp="2026-03-14 09:40:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:40:25.80580564 +0000 UTC m=+2631.318497908" watchObservedRunningTime="2026-03-14 09:40:25.809849062 +0000 UTC m=+2631.322541340" Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.227827 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.320579 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-config-data\") pod \"6035407b-e061-4ae2-998c-cd55b06f781f\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.320665 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6035407b-e061-4ae2-998c-cd55b06f781f-logs\") pod \"6035407b-e061-4ae2-998c-cd55b06f781f\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.320698 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-combined-ca-bundle\") pod \"6035407b-e061-4ae2-998c-cd55b06f781f\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.320779 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-cert-memcached-mtls\") pod \"6035407b-e061-4ae2-998c-cd55b06f781f\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.320810 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpw47\" (UniqueName: \"kubernetes.io/projected/6035407b-e061-4ae2-998c-cd55b06f781f-kube-api-access-bpw47\") pod \"6035407b-e061-4ae2-998c-cd55b06f781f\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.320899 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-custom-prometheus-ca\") pod \"6035407b-e061-4ae2-998c-cd55b06f781f\" (UID: \"6035407b-e061-4ae2-998c-cd55b06f781f\") " Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.321556 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6035407b-e061-4ae2-998c-cd55b06f781f-logs" (OuterVolumeSpecName: "logs") pod "6035407b-e061-4ae2-998c-cd55b06f781f" (UID: "6035407b-e061-4ae2-998c-cd55b06f781f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.326838 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6035407b-e061-4ae2-998c-cd55b06f781f-kube-api-access-bpw47" (OuterVolumeSpecName: "kube-api-access-bpw47") pod "6035407b-e061-4ae2-998c-cd55b06f781f" (UID: "6035407b-e061-4ae2-998c-cd55b06f781f"). InnerVolumeSpecName "kube-api-access-bpw47". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.353065 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6035407b-e061-4ae2-998c-cd55b06f781f" (UID: "6035407b-e061-4ae2-998c-cd55b06f781f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.361263 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "6035407b-e061-4ae2-998c-cd55b06f781f" (UID: "6035407b-e061-4ae2-998c-cd55b06f781f"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.396254 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-config-data" (OuterVolumeSpecName: "config-data") pod "6035407b-e061-4ae2-998c-cd55b06f781f" (UID: "6035407b-e061-4ae2-998c-cd55b06f781f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.416292 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "6035407b-e061-4ae2-998c-cd55b06f781f" (UID: "6035407b-e061-4ae2-998c-cd55b06f781f"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.424734 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.424778 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.424791 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6035407b-e061-4ae2-998c-cd55b06f781f-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.424801 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 
09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.424814 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/6035407b-e061-4ae2-998c-cd55b06f781f-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.424826 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpw47\" (UniqueName: \"kubernetes.io/projected/6035407b-e061-4ae2-998c-cd55b06f781f-kube-api-access-bpw47\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.805293 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e49ee504-2434-4b11-8b9c-baf52f4654df","Type":"ContainerStarted","Data":"4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a"} Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.807933 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.815371 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6035407b-e061-4ae2-998c-cd55b06f781f","Type":"ContainerDied","Data":"266c2e7b94cb31597a278958ed11a83e4e662eee7bf126923a32544f534a8ffb"} Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.815446 4956 scope.go:117] "RemoveContainer" containerID="4dddd0a791ab0c2de072120a147e86d4ae6bf8e03dbc69ad2044ac270f09d233" Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.817883 4956 generic.go:334] "Generic (PLEG): container finished" podID="995eaef5-e453-402f-8c9a-36653d25e6d3" containerID="a1623505805caebc13b8d8fcee88b6e3251419e9e8f0f6b76d5704fab9553286" exitCode=0 Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.818030 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcherdb6f-account-delete-sl6nq" 
event={"ID":"995eaef5-e453-402f-8c9a-36653d25e6d3","Type":"ContainerDied","Data":"a1623505805caebc13b8d8fcee88b6e3251419e9e8f0f6b76d5704fab9553286"} Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.863589 4956 scope.go:117] "RemoveContainer" containerID="e5d94ec0f23a64e98f0675f322403c9554aa649c8aeca67814d5a9fa57b4b7ac" Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.887362 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:40:26 crc kubenswrapper[4956]: I0314 09:40:26.942640 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:40:27 crc kubenswrapper[4956]: E0314 09:40:27.304545 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c0bc917189e842e65890f24b51c698db38d33b7cfa555b0e10bb5cf71508a336" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 09:40:27 crc kubenswrapper[4956]: E0314 09:40:27.308096 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c0bc917189e842e65890f24b51c698db38d33b7cfa555b0e10bb5cf71508a336" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 09:40:27 crc kubenswrapper[4956]: I0314 09:40:27.310881 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6035407b-e061-4ae2-998c-cd55b06f781f" path="/var/lib/kubelet/pods/6035407b-e061-4ae2-998c-cd55b06f781f/volumes" Mar 14 09:40:27 crc kubenswrapper[4956]: I0314 09:40:27.311519 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:40:27 crc kubenswrapper[4956]: E0314 09:40:27.312381 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c0bc917189e842e65890f24b51c698db38d33b7cfa555b0e10bb5cf71508a336" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 09:40:27 crc kubenswrapper[4956]: E0314 09:40:27.312444 4956 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="3156e33d-b397-4cc7-a8e1-1aa640478902" containerName="watcher-applier" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.336259 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcherdb6f-account-delete-sl6nq" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.349108 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcw8s\" (UniqueName: \"kubernetes.io/projected/995eaef5-e453-402f-8c9a-36653d25e6d3-kube-api-access-wcw8s\") pod \"995eaef5-e453-402f-8c9a-36653d25e6d3\" (UID: \"995eaef5-e453-402f-8c9a-36653d25e6d3\") " Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.349271 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/995eaef5-e453-402f-8c9a-36653d25e6d3-operator-scripts\") pod \"995eaef5-e453-402f-8c9a-36653d25e6d3\" (UID: \"995eaef5-e453-402f-8c9a-36653d25e6d3\") " Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.349947 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/995eaef5-e453-402f-8c9a-36653d25e6d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "995eaef5-e453-402f-8c9a-36653d25e6d3" (UID: "995eaef5-e453-402f-8c9a-36653d25e6d3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.362747 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/995eaef5-e453-402f-8c9a-36653d25e6d3-kube-api-access-wcw8s" (OuterVolumeSpecName: "kube-api-access-wcw8s") pod "995eaef5-e453-402f-8c9a-36653d25e6d3" (UID: "995eaef5-e453-402f-8c9a-36653d25e6d3"). InnerVolumeSpecName "kube-api-access-wcw8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.450746 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcw8s\" (UniqueName: \"kubernetes.io/projected/995eaef5-e453-402f-8c9a-36653d25e6d3-kube-api-access-wcw8s\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.450788 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/995eaef5-e453-402f-8c9a-36653d25e6d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.636142 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.755995 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-cert-memcached-mtls\") pod \"3156e33d-b397-4cc7-a8e1-1aa640478902\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.756418 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-combined-ca-bundle\") pod \"3156e33d-b397-4cc7-a8e1-1aa640478902\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.757088 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpgrp\" (UniqueName: \"kubernetes.io/projected/3156e33d-b397-4cc7-a8e1-1aa640478902-kube-api-access-dpgrp\") pod \"3156e33d-b397-4cc7-a8e1-1aa640478902\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.757127 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-config-data\") pod \"3156e33d-b397-4cc7-a8e1-1aa640478902\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.757162 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3156e33d-b397-4cc7-a8e1-1aa640478902-logs\") pod \"3156e33d-b397-4cc7-a8e1-1aa640478902\" (UID: \"3156e33d-b397-4cc7-a8e1-1aa640478902\") " Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.757589 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3156e33d-b397-4cc7-a8e1-1aa640478902-logs" (OuterVolumeSpecName: "logs") pod "3156e33d-b397-4cc7-a8e1-1aa640478902" (UID: "3156e33d-b397-4cc7-a8e1-1aa640478902"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.757732 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3156e33d-b397-4cc7-a8e1-1aa640478902-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.766275 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3156e33d-b397-4cc7-a8e1-1aa640478902-kube-api-access-dpgrp" (OuterVolumeSpecName: "kube-api-access-dpgrp") pod "3156e33d-b397-4cc7-a8e1-1aa640478902" (UID: "3156e33d-b397-4cc7-a8e1-1aa640478902"). InnerVolumeSpecName "kube-api-access-dpgrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.786213 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3156e33d-b397-4cc7-a8e1-1aa640478902" (UID: "3156e33d-b397-4cc7-a8e1-1aa640478902"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.814539 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-config-data" (OuterVolumeSpecName: "config-data") pod "3156e33d-b397-4cc7-a8e1-1aa640478902" (UID: "3156e33d-b397-4cc7-a8e1-1aa640478902"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.830430 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "3156e33d-b397-4cc7-a8e1-1aa640478902" (UID: "3156e33d-b397-4cc7-a8e1-1aa640478902"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.848600 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcherdb6f-account-delete-sl6nq" event={"ID":"995eaef5-e453-402f-8c9a-36653d25e6d3","Type":"ContainerDied","Data":"a012a0aec688710d93ca3274b9d45af541ff1c23ad60f0866c2ca94208b22e5a"} Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.848667 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a012a0aec688710d93ca3274b9d45af541ff1c23ad60f0866c2ca94208b22e5a" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.848755 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcherdb6f-account-delete-sl6nq" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.854798 4956 generic.go:334] "Generic (PLEG): container finished" podID="3156e33d-b397-4cc7-a8e1-1aa640478902" containerID="c0bc917189e842e65890f24b51c698db38d33b7cfa555b0e10bb5cf71508a336" exitCode=0 Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.854904 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"3156e33d-b397-4cc7-a8e1-1aa640478902","Type":"ContainerDied","Data":"c0bc917189e842e65890f24b51c698db38d33b7cfa555b0e10bb5cf71508a336"} Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.854961 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"3156e33d-b397-4cc7-a8e1-1aa640478902","Type":"ContainerDied","Data":"8fed6d9e5bb5f0b6e6674a2fcf2ab9ab0ca319f50e9475ca45b1930671942589"} Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.854987 4956 scope.go:117] "RemoveContainer" containerID="c0bc917189e842e65890f24b51c698db38d33b7cfa555b0e10bb5cf71508a336" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.855672 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.860365 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.860405 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpgrp\" (UniqueName: \"kubernetes.io/projected/3156e33d-b397-4cc7-a8e1-1aa640478902-kube-api-access-dpgrp\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.860420 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.860434 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3156e33d-b397-4cc7-a8e1-1aa640478902-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.885232 4956 scope.go:117] "RemoveContainer" containerID="c0bc917189e842e65890f24b51c698db38d33b7cfa555b0e10bb5cf71508a336" Mar 14 09:40:29 crc kubenswrapper[4956]: E0314 09:40:29.886151 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0bc917189e842e65890f24b51c698db38d33b7cfa555b0e10bb5cf71508a336\": container with ID starting with c0bc917189e842e65890f24b51c698db38d33b7cfa555b0e10bb5cf71508a336 not found: ID does not exist" containerID="c0bc917189e842e65890f24b51c698db38d33b7cfa555b0e10bb5cf71508a336" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.886268 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c0bc917189e842e65890f24b51c698db38d33b7cfa555b0e10bb5cf71508a336"} err="failed to get container status \"c0bc917189e842e65890f24b51c698db38d33b7cfa555b0e10bb5cf71508a336\": rpc error: code = NotFound desc = could not find container \"c0bc917189e842e65890f24b51c698db38d33b7cfa555b0e10bb5cf71508a336\": container with ID starting with c0bc917189e842e65890f24b51c698db38d33b7cfa555b0e10bb5cf71508a336 not found: ID does not exist" Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.889410 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:40:29 crc kubenswrapper[4956]: I0314 09:40:29.900737 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:40:30 crc kubenswrapper[4956]: I0314 09:40:30.865931 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e49ee504-2434-4b11-8b9c-baf52f4654df","Type":"ContainerStarted","Data":"9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04"} Mar 14 09:40:30 crc kubenswrapper[4956]: I0314 09:40:30.866068 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerName="ceilometer-central-agent" containerID="cri-o://949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3" gracePeriod=30 Mar 14 09:40:30 crc kubenswrapper[4956]: I0314 09:40:30.866105 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerName="sg-core" containerID="cri-o://4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a" gracePeriod=30 Mar 14 09:40:30 crc kubenswrapper[4956]: I0314 09:40:30.866099 4956 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="watcher-kuttl-default/ceilometer-0" podUID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerName="proxy-httpd" containerID="cri-o://9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04" gracePeriod=30 Mar 14 09:40:30 crc kubenswrapper[4956]: I0314 09:40:30.866168 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerName="ceilometer-notification-agent" containerID="cri-o://b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd" gracePeriod=30 Mar 14 09:40:30 crc kubenswrapper[4956]: I0314 09:40:30.868404 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:30 crc kubenswrapper[4956]: I0314 09:40:30.899784 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.283721683 podStartE2EDuration="9.899751191s" podCreationTimestamp="2026-03-14 09:40:21 +0000 UTC" firstStartedPulling="2026-03-14 09:40:22.557094042 +0000 UTC m=+2628.069786310" lastFinishedPulling="2026-03-14 09:40:30.17312355 +0000 UTC m=+2635.685815818" observedRunningTime="2026-03-14 09:40:30.889173303 +0000 UTC m=+2636.401865611" watchObservedRunningTime="2026-03-14 09:40:30.899751191 +0000 UTC m=+2636.412443459" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.217764 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3156e33d-b397-4cc7-a8e1-1aa640478902" path="/var/lib/kubelet/pods/3156e33d-b397-4cc7-a8e1-1aa640478902/volumes" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.603559 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.791512 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-scripts\") pod \"e49ee504-2434-4b11-8b9c-baf52f4654df\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.791564 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-sg-core-conf-yaml\") pod \"e49ee504-2434-4b11-8b9c-baf52f4654df\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.791665 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-ceilometer-tls-certs\") pod \"e49ee504-2434-4b11-8b9c-baf52f4654df\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.791685 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-combined-ca-bundle\") pod \"e49ee504-2434-4b11-8b9c-baf52f4654df\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.791705 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e49ee504-2434-4b11-8b9c-baf52f4654df-run-httpd\") pod \"e49ee504-2434-4b11-8b9c-baf52f4654df\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.791742 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-config-data\") pod \"e49ee504-2434-4b11-8b9c-baf52f4654df\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.791829 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79kkb\" (UniqueName: \"kubernetes.io/projected/e49ee504-2434-4b11-8b9c-baf52f4654df-kube-api-access-79kkb\") pod \"e49ee504-2434-4b11-8b9c-baf52f4654df\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.791871 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e49ee504-2434-4b11-8b9c-baf52f4654df-log-httpd\") pod \"e49ee504-2434-4b11-8b9c-baf52f4654df\" (UID: \"e49ee504-2434-4b11-8b9c-baf52f4654df\") " Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.792498 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e49ee504-2434-4b11-8b9c-baf52f4654df-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e49ee504-2434-4b11-8b9c-baf52f4654df" (UID: "e49ee504-2434-4b11-8b9c-baf52f4654df"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.792641 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e49ee504-2434-4b11-8b9c-baf52f4654df-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e49ee504-2434-4b11-8b9c-baf52f4654df" (UID: "e49ee504-2434-4b11-8b9c-baf52f4654df"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.796681 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-scripts" (OuterVolumeSpecName: "scripts") pod "e49ee504-2434-4b11-8b9c-baf52f4654df" (UID: "e49ee504-2434-4b11-8b9c-baf52f4654df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.796888 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e49ee504-2434-4b11-8b9c-baf52f4654df-kube-api-access-79kkb" (OuterVolumeSpecName: "kube-api-access-79kkb") pod "e49ee504-2434-4b11-8b9c-baf52f4654df" (UID: "e49ee504-2434-4b11-8b9c-baf52f4654df"). InnerVolumeSpecName "kube-api-access-79kkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.821475 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e49ee504-2434-4b11-8b9c-baf52f4654df" (UID: "e49ee504-2434-4b11-8b9c-baf52f4654df"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.835147 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e49ee504-2434-4b11-8b9c-baf52f4654df" (UID: "e49ee504-2434-4b11-8b9c-baf52f4654df"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.855158 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e49ee504-2434-4b11-8b9c-baf52f4654df" (UID: "e49ee504-2434-4b11-8b9c-baf52f4654df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.872162 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-config-data" (OuterVolumeSpecName: "config-data") pod "e49ee504-2434-4b11-8b9c-baf52f4654df" (UID: "e49ee504-2434-4b11-8b9c-baf52f4654df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.878724 4956 generic.go:334] "Generic (PLEG): container finished" podID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerID="9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04" exitCode=0 Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.878779 4956 generic.go:334] "Generic (PLEG): container finished" podID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerID="4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a" exitCode=2 Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.878788 4956 generic.go:334] "Generic (PLEG): container finished" podID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerID="b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd" exitCode=0 Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.878842 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.878828 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e49ee504-2434-4b11-8b9c-baf52f4654df","Type":"ContainerDied","Data":"9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04"} Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.878890 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e49ee504-2434-4b11-8b9c-baf52f4654df","Type":"ContainerDied","Data":"4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a"} Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.878902 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e49ee504-2434-4b11-8b9c-baf52f4654df","Type":"ContainerDied","Data":"b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd"} Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.878911 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e49ee504-2434-4b11-8b9c-baf52f4654df","Type":"ContainerDied","Data":"949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3"} Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.878927 4956 scope.go:117] "RemoveContainer" containerID="9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.878959 4956 generic.go:334] "Generic (PLEG): container finished" podID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerID="949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3" exitCode=0 Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.878984 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"e49ee504-2434-4b11-8b9c-baf52f4654df","Type":"ContainerDied","Data":"66aee52defedfe85545f527a965c156381322c2a6d8535d97138e88b719cc14d"} Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.894967 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.895000 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.895013 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e49ee504-2434-4b11-8b9c-baf52f4654df-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.895022 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.895037 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79kkb\" (UniqueName: \"kubernetes.io/projected/e49ee504-2434-4b11-8b9c-baf52f4654df-kube-api-access-79kkb\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.895048 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e49ee504-2434-4b11-8b9c-baf52f4654df-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.895057 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-scripts\") on node \"crc\" 
DevicePath \"\"" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.895070 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e49ee504-2434-4b11-8b9c-baf52f4654df-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.899745 4956 scope.go:117] "RemoveContainer" containerID="4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.924113 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.930871 4956 scope.go:117] "RemoveContainer" containerID="b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.931843 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.957888 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:40:31 crc kubenswrapper[4956]: E0314 09:40:31.958407 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6035407b-e061-4ae2-998c-cd55b06f781f" containerName="watcher-api" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.958427 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6035407b-e061-4ae2-998c-cd55b06f781f" containerName="watcher-api" Mar 14 09:40:31 crc kubenswrapper[4956]: E0314 09:40:31.958440 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerName="proxy-httpd" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.958448 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerName="proxy-httpd" Mar 14 09:40:31 crc kubenswrapper[4956]: E0314 09:40:31.958461 4956 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerName="ceilometer-central-agent" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.958468 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerName="ceilometer-central-agent" Mar 14 09:40:31 crc kubenswrapper[4956]: E0314 09:40:31.958510 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6035407b-e061-4ae2-998c-cd55b06f781f" containerName="watcher-kuttl-api-log" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.958519 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6035407b-e061-4ae2-998c-cd55b06f781f" containerName="watcher-kuttl-api-log" Mar 14 09:40:31 crc kubenswrapper[4956]: E0314 09:40:31.958533 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="995eaef5-e453-402f-8c9a-36653d25e6d3" containerName="mariadb-account-delete" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.958541 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="995eaef5-e453-402f-8c9a-36653d25e6d3" containerName="mariadb-account-delete" Mar 14 09:40:31 crc kubenswrapper[4956]: E0314 09:40:31.958560 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerName="ceilometer-notification-agent" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.958567 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerName="ceilometer-notification-agent" Mar 14 09:40:31 crc kubenswrapper[4956]: E0314 09:40:31.958580 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerName="sg-core" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.958587 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerName="sg-core" Mar 14 09:40:31 crc kubenswrapper[4956]: E0314 
09:40:31.958598 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3156e33d-b397-4cc7-a8e1-1aa640478902" containerName="watcher-applier" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.958605 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="3156e33d-b397-4cc7-a8e1-1aa640478902" containerName="watcher-applier" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.958801 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerName="sg-core" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.958814 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerName="proxy-httpd" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.958830 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6035407b-e061-4ae2-998c-cd55b06f781f" containerName="watcher-kuttl-api-log" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.958842 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerName="ceilometer-central-agent" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.958858 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6035407b-e061-4ae2-998c-cd55b06f781f" containerName="watcher-api" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.958871 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="3156e33d-b397-4cc7-a8e1-1aa640478902" containerName="watcher-applier" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.958881 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e49ee504-2434-4b11-8b9c-baf52f4654df" containerName="ceilometer-notification-agent" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.958896 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="995eaef5-e453-402f-8c9a-36653d25e6d3" containerName="mariadb-account-delete" Mar 
14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.960585 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.963596 4956 scope.go:117] "RemoveContainer" containerID="949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.963818 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.964037 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.965996 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:40:31 crc kubenswrapper[4956]: I0314 09:40:31.968314 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.045003 4956 scope.go:117] "RemoveContainer" containerID="9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04" Mar 14 09:40:32 crc kubenswrapper[4956]: E0314 09:40:32.045516 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04\": container with ID starting with 9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04 not found: ID does not exist" containerID="9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.045563 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04"} err="failed to get container status 
\"9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04\": rpc error: code = NotFound desc = could not find container \"9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04\": container with ID starting with 9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04 not found: ID does not exist" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.045593 4956 scope.go:117] "RemoveContainer" containerID="4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a" Mar 14 09:40:32 crc kubenswrapper[4956]: E0314 09:40:32.046248 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a\": container with ID starting with 4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a not found: ID does not exist" containerID="4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.046284 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a"} err="failed to get container status \"4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a\": rpc error: code = NotFound desc = could not find container \"4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a\": container with ID starting with 4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a not found: ID does not exist" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.046308 4956 scope.go:117] "RemoveContainer" containerID="b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd" Mar 14 09:40:32 crc kubenswrapper[4956]: E0314 09:40:32.046597 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd\": container with ID starting with b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd not found: ID does not exist" containerID="b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.046624 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd"} err="failed to get container status \"b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd\": rpc error: code = NotFound desc = could not find container \"b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd\": container with ID starting with b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd not found: ID does not exist" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.046644 4956 scope.go:117] "RemoveContainer" containerID="949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3" Mar 14 09:40:32 crc kubenswrapper[4956]: E0314 09:40:32.046957 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3\": container with ID starting with 949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3 not found: ID does not exist" containerID="949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.046978 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3"} err="failed to get container status \"949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3\": rpc error: code = NotFound desc = could not find container \"949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3\": container with ID 
starting with 949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3 not found: ID does not exist" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.046993 4956 scope.go:117] "RemoveContainer" containerID="9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.047282 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04"} err="failed to get container status \"9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04\": rpc error: code = NotFound desc = could not find container \"9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04\": container with ID starting with 9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04 not found: ID does not exist" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.047308 4956 scope.go:117] "RemoveContainer" containerID="4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.047555 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a"} err="failed to get container status \"4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a\": rpc error: code = NotFound desc = could not find container \"4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a\": container with ID starting with 4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a not found: ID does not exist" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.047574 4956 scope.go:117] "RemoveContainer" containerID="b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.047837 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd"} err="failed to get container status \"b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd\": rpc error: code = NotFound desc = could not find container \"b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd\": container with ID starting with b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd not found: ID does not exist" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.047857 4956 scope.go:117] "RemoveContainer" containerID="949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.048118 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3"} err="failed to get container status \"949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3\": rpc error: code = NotFound desc = could not find container \"949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3\": container with ID starting with 949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3 not found: ID does not exist" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.048135 4956 scope.go:117] "RemoveContainer" containerID="9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.048389 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04"} err="failed to get container status \"9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04\": rpc error: code = NotFound desc = could not find container \"9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04\": container with ID starting with 9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04 not found: ID does not 
exist" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.048407 4956 scope.go:117] "RemoveContainer" containerID="4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.048670 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a"} err="failed to get container status \"4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a\": rpc error: code = NotFound desc = could not find container \"4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a\": container with ID starting with 4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a not found: ID does not exist" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.048689 4956 scope.go:117] "RemoveContainer" containerID="b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.048954 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd"} err="failed to get container status \"b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd\": rpc error: code = NotFound desc = could not find container \"b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd\": container with ID starting with b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd not found: ID does not exist" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.048977 4956 scope.go:117] "RemoveContainer" containerID="949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.049238 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3"} err="failed to get container status 
\"949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3\": rpc error: code = NotFound desc = could not find container \"949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3\": container with ID starting with 949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3 not found: ID does not exist" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.049259 4956 scope.go:117] "RemoveContainer" containerID="9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.050147 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04"} err="failed to get container status \"9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04\": rpc error: code = NotFound desc = could not find container \"9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04\": container with ID starting with 9a611073799182c2b24561eb5a402a546d39198204e90277aca6eff025d31f04 not found: ID does not exist" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.050169 4956 scope.go:117] "RemoveContainer" containerID="4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.050937 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a"} err="failed to get container status \"4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a\": rpc error: code = NotFound desc = could not find container \"4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a\": container with ID starting with 4e603b3a88a41ce691967d80d738feb06760f55b68402c27848fac333291449a not found: ID does not exist" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.050957 4956 scope.go:117] "RemoveContainer" 
containerID="b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.051243 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd"} err="failed to get container status \"b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd\": rpc error: code = NotFound desc = could not find container \"b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd\": container with ID starting with b13c75da4db9badd5f65e376b3cbfb8e451aec85deb4ba18abcee7f99ae6f6dd not found: ID does not exist" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.051278 4956 scope.go:117] "RemoveContainer" containerID="949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.052120 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3"} err="failed to get container status \"949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3\": rpc error: code = NotFound desc = could not find container \"949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3\": container with ID starting with 949a8836ca5404ca978680ad97911ecbf8c6cc27a8b09b117ce9f5bb6922e2c3 not found: ID does not exist" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.097589 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.097644 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-run-httpd\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.097674 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-scripts\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.097719 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-config-data\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.097733 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-log-httpd\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.097806 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.097831 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdp7h\" (UniqueName: 
\"kubernetes.io/projected/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-kube-api-access-sdp7h\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.097855 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.199634 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-config-data\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.199683 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-log-httpd\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.199735 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.199895 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdp7h\" (UniqueName: \"kubernetes.io/projected/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-kube-api-access-sdp7h\") pod \"ceilometer-0\" (UID: 
\"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.199927 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.199976 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.200000 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-run-httpd\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.200717 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-scripts\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.200764 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-log-httpd\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.200811 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-run-httpd\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.203507 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.204006 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-config-data\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.205118 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.209905 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-scripts\") pod \"ceilometer-0\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.212597 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: E0314 09:40:32.215717 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3eb4fc8b9002dab5466ead4ab21e8a1e37b3ac2d7c657c650ec59aac6b8212f3" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Mar 14 09:40:32 crc kubenswrapper[4956]: E0314 09:40:32.217385 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3eb4fc8b9002dab5466ead4ab21e8a1e37b3ac2d7c657c650ec59aac6b8212f3" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Mar 14 09:40:32 crc kubenswrapper[4956]: E0314 09:40:32.219435 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3eb4fc8b9002dab5466ead4ab21e8a1e37b3ac2d7c657c650ec59aac6b8212f3" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Mar 14 09:40:32 crc kubenswrapper[4956]: E0314 09:40:32.219498 4956 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="0100c485-2f3c-4cbf-ac49-566c008facb5" containerName="watcher-decision-engine" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.219933 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdp7h\" (UniqueName: \"kubernetes.io/projected/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-kube-api-access-sdp7h\") pod \"ceilometer-0\" (UID: 
\"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.336969 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.760581 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:40:32 crc kubenswrapper[4956]: I0314 09:40:32.889556 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9","Type":"ContainerStarted","Data":"eaa788bb559025dfeb6aa44a86ff5ab834101234644169bb59c82de7f8b84d81"} Mar 14 09:40:33 crc kubenswrapper[4956]: I0314 09:40:33.219997 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e49ee504-2434-4b11-8b9c-baf52f4654df" path="/var/lib/kubelet/pods/e49ee504-2434-4b11-8b9c-baf52f4654df/volumes" Mar 14 09:40:33 crc kubenswrapper[4956]: I0314 09:40:33.902225 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9","Type":"ContainerStarted","Data":"eb67e334b6c617fb9de0c6acaa252073ec9a494a44f59a20b4fe18eb912be13e"} Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.507024 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-659h8"] Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.517810 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-659h8"] Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.535918 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm"] Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.544135 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["watcher-kuttl-default/watcherdb6f-account-delete-sl6nq"] Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.560634 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db6f-account-create-update-7bmvm"] Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.573141 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcherdb6f-account-delete-sl6nq"] Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.593849 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-b4t9h"] Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.595104 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-b4t9h" Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.604169 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-b4t9h"] Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.702685 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-update-7wqsv"] Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.703721 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-test-account-create-update-7wqsv" Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.709121 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.729735 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-update-7wqsv"] Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.744142 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abd728e9-6c5e-4ace-ab76-af2f7eb7e229-operator-scripts\") pod \"watcher-db-create-b4t9h\" (UID: \"abd728e9-6c5e-4ace-ab76-af2f7eb7e229\") " pod="watcher-kuttl-default/watcher-db-create-b4t9h" Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.744225 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbwt8\" (UniqueName: \"kubernetes.io/projected/abd728e9-6c5e-4ace-ab76-af2f7eb7e229-kube-api-access-lbwt8\") pod \"watcher-db-create-b4t9h\" (UID: \"abd728e9-6c5e-4ace-ab76-af2f7eb7e229\") " pod="watcher-kuttl-default/watcher-db-create-b4t9h" Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.845875 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed6584f5-8c92-4efb-946f-77d205b1758a-operator-scripts\") pod \"watcher-test-account-create-update-7wqsv\" (UID: \"ed6584f5-8c92-4efb-946f-77d205b1758a\") " pod="watcher-kuttl-default/watcher-test-account-create-update-7wqsv" Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.846119 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrx97\" (UniqueName: 
\"kubernetes.io/projected/ed6584f5-8c92-4efb-946f-77d205b1758a-kube-api-access-nrx97\") pod \"watcher-test-account-create-update-7wqsv\" (UID: \"ed6584f5-8c92-4efb-946f-77d205b1758a\") " pod="watcher-kuttl-default/watcher-test-account-create-update-7wqsv" Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.846278 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abd728e9-6c5e-4ace-ab76-af2f7eb7e229-operator-scripts\") pod \"watcher-db-create-b4t9h\" (UID: \"abd728e9-6c5e-4ace-ab76-af2f7eb7e229\") " pod="watcher-kuttl-default/watcher-db-create-b4t9h" Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.846398 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbwt8\" (UniqueName: \"kubernetes.io/projected/abd728e9-6c5e-4ace-ab76-af2f7eb7e229-kube-api-access-lbwt8\") pod \"watcher-db-create-b4t9h\" (UID: \"abd728e9-6c5e-4ace-ab76-af2f7eb7e229\") " pod="watcher-kuttl-default/watcher-db-create-b4t9h" Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.847103 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abd728e9-6c5e-4ace-ab76-af2f7eb7e229-operator-scripts\") pod \"watcher-db-create-b4t9h\" (UID: \"abd728e9-6c5e-4ace-ab76-af2f7eb7e229\") " pod="watcher-kuttl-default/watcher-db-create-b4t9h" Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.871126 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbwt8\" (UniqueName: \"kubernetes.io/projected/abd728e9-6c5e-4ace-ab76-af2f7eb7e229-kube-api-access-lbwt8\") pod \"watcher-db-create-b4t9h\" (UID: \"abd728e9-6c5e-4ace-ab76-af2f7eb7e229\") " pod="watcher-kuttl-default/watcher-db-create-b4t9h" Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.916304 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9","Type":"ContainerStarted","Data":"33b63dea98d1d23d4a7cf0e3d3f8a503b6d4b74f857b126b38febdfe40cf8bfc"} Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.917659 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-b4t9h" Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.947995 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed6584f5-8c92-4efb-946f-77d205b1758a-operator-scripts\") pod \"watcher-test-account-create-update-7wqsv\" (UID: \"ed6584f5-8c92-4efb-946f-77d205b1758a\") " pod="watcher-kuttl-default/watcher-test-account-create-update-7wqsv" Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.948662 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrx97\" (UniqueName: \"kubernetes.io/projected/ed6584f5-8c92-4efb-946f-77d205b1758a-kube-api-access-nrx97\") pod \"watcher-test-account-create-update-7wqsv\" (UID: \"ed6584f5-8c92-4efb-946f-77d205b1758a\") " pod="watcher-kuttl-default/watcher-test-account-create-update-7wqsv" Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.952246 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed6584f5-8c92-4efb-946f-77d205b1758a-operator-scripts\") pod \"watcher-test-account-create-update-7wqsv\" (UID: \"ed6584f5-8c92-4efb-946f-77d205b1758a\") " pod="watcher-kuttl-default/watcher-test-account-create-update-7wqsv" Mar 14 09:40:34 crc kubenswrapper[4956]: I0314 09:40:34.969649 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrx97\" (UniqueName: \"kubernetes.io/projected/ed6584f5-8c92-4efb-946f-77d205b1758a-kube-api-access-nrx97\") pod \"watcher-test-account-create-update-7wqsv\" (UID: \"ed6584f5-8c92-4efb-946f-77d205b1758a\") " 
pod="watcher-kuttl-default/watcher-test-account-create-update-7wqsv" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.022949 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-test-account-create-update-7wqsv" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.230820 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="729edbbf-6306-4988-98f5-894516eee11f" path="/var/lib/kubelet/pods/729edbbf-6306-4988-98f5-894516eee11f/volumes" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.232082 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="995eaef5-e453-402f-8c9a-36653d25e6d3" path="/var/lib/kubelet/pods/995eaef5-e453-402f-8c9a-36653d25e6d3/volumes" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.233029 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e175020b-d63c-4bdb-b605-78838ddb4f3f" path="/var/lib/kubelet/pods/e175020b-d63c-4bdb-b605-78838ddb4f3f/volumes" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.391418 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-b4t9h"] Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.587857 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-update-7wqsv"] Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.610033 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.750347 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.869078 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd4zs\" (UniqueName: \"kubernetes.io/projected/0100c485-2f3c-4cbf-ac49-566c008facb5-kube-api-access-cd4zs\") pod \"0100c485-2f3c-4cbf-ac49-566c008facb5\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.869160 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-custom-prometheus-ca\") pod \"0100c485-2f3c-4cbf-ac49-566c008facb5\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.869233 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-combined-ca-bundle\") pod \"0100c485-2f3c-4cbf-ac49-566c008facb5\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.869327 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0100c485-2f3c-4cbf-ac49-566c008facb5-logs\") pod \"0100c485-2f3c-4cbf-ac49-566c008facb5\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.869387 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-config-data\") pod \"0100c485-2f3c-4cbf-ac49-566c008facb5\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.869434 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-cert-memcached-mtls\") pod \"0100c485-2f3c-4cbf-ac49-566c008facb5\" (UID: \"0100c485-2f3c-4cbf-ac49-566c008facb5\") " Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.871034 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0100c485-2f3c-4cbf-ac49-566c008facb5-logs" (OuterVolumeSpecName: "logs") pod "0100c485-2f3c-4cbf-ac49-566c008facb5" (UID: "0100c485-2f3c-4cbf-ac49-566c008facb5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.876159 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0100c485-2f3c-4cbf-ac49-566c008facb5-kube-api-access-cd4zs" (OuterVolumeSpecName: "kube-api-access-cd4zs") pod "0100c485-2f3c-4cbf-ac49-566c008facb5" (UID: "0100c485-2f3c-4cbf-ac49-566c008facb5"). InnerVolumeSpecName "kube-api-access-cd4zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.904247 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "0100c485-2f3c-4cbf-ac49-566c008facb5" (UID: "0100c485-2f3c-4cbf-ac49-566c008facb5"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.912252 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0100c485-2f3c-4cbf-ac49-566c008facb5" (UID: "0100c485-2f3c-4cbf-ac49-566c008facb5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.928262 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9","Type":"ContainerStarted","Data":"c4f13f8c640076d9b7f8102b4035aa5f3fb114a169255aefde1cfccd154abb56"} Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.930224 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-b4t9h" event={"ID":"abd728e9-6c5e-4ace-ab76-af2f7eb7e229","Type":"ContainerStarted","Data":"f7937fa0cab416169b6f01ff83e48f22f4d8b69c8b3932bdf4a026c6fb49ebf4"} Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.930253 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-b4t9h" event={"ID":"abd728e9-6c5e-4ace-ab76-af2f7eb7e229","Type":"ContainerStarted","Data":"1830c4705a1d0abe2c5869b2c3d85b5bd18ef5d90575912f1b9bb007ff812181"} Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.932982 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-config-data" (OuterVolumeSpecName: "config-data") pod "0100c485-2f3c-4cbf-ac49-566c008facb5" (UID: "0100c485-2f3c-4cbf-ac49-566c008facb5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.934798 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-test-account-create-update-7wqsv" event={"ID":"ed6584f5-8c92-4efb-946f-77d205b1758a","Type":"ContainerStarted","Data":"10471dc0fc95c2ba4feb5a8f549497f00af5e7ff0dc32e67305663bce8757d5e"} Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.934843 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-test-account-create-update-7wqsv" event={"ID":"ed6584f5-8c92-4efb-946f-77d205b1758a","Type":"ContainerStarted","Data":"ee96ae218441e4aa7e39a53f6c81e0d5a106f7fc218dd09a745912d96991bd08"} Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.936915 4956 generic.go:334] "Generic (PLEG): container finished" podID="0100c485-2f3c-4cbf-ac49-566c008facb5" containerID="3eb4fc8b9002dab5466ead4ab21e8a1e37b3ac2d7c657c650ec59aac6b8212f3" exitCode=0 Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.936957 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0100c485-2f3c-4cbf-ac49-566c008facb5","Type":"ContainerDied","Data":"3eb4fc8b9002dab5466ead4ab21e8a1e37b3ac2d7c657c650ec59aac6b8212f3"} Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.936981 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0100c485-2f3c-4cbf-ac49-566c008facb5","Type":"ContainerDied","Data":"24af4f10a901927dcee441232a1bf07970c2def6608a41b9e70db10dce1b4557"} Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.936998 4956 scope.go:117] "RemoveContainer" containerID="3eb4fc8b9002dab5466ead4ab21e8a1e37b3ac2d7c657c650ec59aac6b8212f3" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.937042 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.955699 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "0100c485-2f3c-4cbf-ac49-566c008facb5" (UID: "0100c485-2f3c-4cbf-ac49-566c008facb5"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.962255 4956 scope.go:117] "RemoveContainer" containerID="3eb4fc8b9002dab5466ead4ab21e8a1e37b3ac2d7c657c650ec59aac6b8212f3" Mar 14 09:40:35 crc kubenswrapper[4956]: E0314 09:40:35.962956 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eb4fc8b9002dab5466ead4ab21e8a1e37b3ac2d7c657c650ec59aac6b8212f3\": container with ID starting with 3eb4fc8b9002dab5466ead4ab21e8a1e37b3ac2d7c657c650ec59aac6b8212f3 not found: ID does not exist" containerID="3eb4fc8b9002dab5466ead4ab21e8a1e37b3ac2d7c657c650ec59aac6b8212f3" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.962988 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eb4fc8b9002dab5466ead4ab21e8a1e37b3ac2d7c657c650ec59aac6b8212f3"} err="failed to get container status \"3eb4fc8b9002dab5466ead4ab21e8a1e37b3ac2d7c657c650ec59aac6b8212f3\": rpc error: code = NotFound desc = could not find container \"3eb4fc8b9002dab5466ead4ab21e8a1e37b3ac2d7c657c650ec59aac6b8212f3\": container with ID starting with 3eb4fc8b9002dab5466ead4ab21e8a1e37b3ac2d7c657c650ec59aac6b8212f3 not found: ID does not exist" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.963505 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-db-create-b4t9h" podStartSLOduration=1.963468918 
podStartE2EDuration="1.963468918s" podCreationTimestamp="2026-03-14 09:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:40:35.945501044 +0000 UTC m=+2641.458193312" watchObservedRunningTime="2026-03-14 09:40:35.963468918 +0000 UTC m=+2641.476161186" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.966759 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-test-account-create-update-7wqsv" podStartSLOduration=1.966746871 podStartE2EDuration="1.966746871s" podCreationTimestamp="2026-03-14 09:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:40:35.965297605 +0000 UTC m=+2641.477989873" watchObservedRunningTime="2026-03-14 09:40:35.966746871 +0000 UTC m=+2641.479439139" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.971464 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd4zs\" (UniqueName: \"kubernetes.io/projected/0100c485-2f3c-4cbf-ac49-566c008facb5-kube-api-access-cd4zs\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.971517 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.971527 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.971540 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0100c485-2f3c-4cbf-ac49-566c008facb5-logs\") 
on node \"crc\" DevicePath \"\"" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.971549 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:35 crc kubenswrapper[4956]: I0314 09:40:35.971559 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0100c485-2f3c-4cbf-ac49-566c008facb5-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:36 crc kubenswrapper[4956]: I0314 09:40:36.334081 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:40:36 crc kubenswrapper[4956]: I0314 09:40:36.340770 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:40:36 crc kubenswrapper[4956]: I0314 09:40:36.949708 4956 generic.go:334] "Generic (PLEG): container finished" podID="abd728e9-6c5e-4ace-ab76-af2f7eb7e229" containerID="f7937fa0cab416169b6f01ff83e48f22f4d8b69c8b3932bdf4a026c6fb49ebf4" exitCode=0 Mar 14 09:40:36 crc kubenswrapper[4956]: I0314 09:40:36.949853 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-b4t9h" event={"ID":"abd728e9-6c5e-4ace-ab76-af2f7eb7e229","Type":"ContainerDied","Data":"f7937fa0cab416169b6f01ff83e48f22f4d8b69c8b3932bdf4a026c6fb49ebf4"} Mar 14 09:40:36 crc kubenswrapper[4956]: I0314 09:40:36.952412 4956 generic.go:334] "Generic (PLEG): container finished" podID="ed6584f5-8c92-4efb-946f-77d205b1758a" containerID="10471dc0fc95c2ba4feb5a8f549497f00af5e7ff0dc32e67305663bce8757d5e" exitCode=0 Mar 14 09:40:36 crc kubenswrapper[4956]: I0314 09:40:36.952471 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-test-account-create-update-7wqsv" 
event={"ID":"ed6584f5-8c92-4efb-946f-77d205b1758a","Type":"ContainerDied","Data":"10471dc0fc95c2ba4feb5a8f549497f00af5e7ff0dc32e67305663bce8757d5e"} Mar 14 09:40:36 crc kubenswrapper[4956]: I0314 09:40:36.957608 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9","Type":"ContainerStarted","Data":"0eebd3fefef86f16e9db8478b98c92cce482f8010ee4f57144c7b1eec920d745"} Mar 14 09:40:36 crc kubenswrapper[4956]: I0314 09:40:36.958776 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:40:37 crc kubenswrapper[4956]: I0314 09:40:37.007806 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.221296691 podStartE2EDuration="6.00778187s" podCreationTimestamp="2026-03-14 09:40:31 +0000 UTC" firstStartedPulling="2026-03-14 09:40:32.765955877 +0000 UTC m=+2638.278648135" lastFinishedPulling="2026-03-14 09:40:36.552441046 +0000 UTC m=+2642.065133314" observedRunningTime="2026-03-14 09:40:36.998445434 +0000 UTC m=+2642.511137702" watchObservedRunningTime="2026-03-14 09:40:37.00778187 +0000 UTC m=+2642.520474138" Mar 14 09:40:37 crc kubenswrapper[4956]: I0314 09:40:37.226577 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0100c485-2f3c-4cbf-ac49-566c008facb5" path="/var/lib/kubelet/pods/0100c485-2f3c-4cbf-ac49-566c008facb5/volumes" Mar 14 09:40:38 crc kubenswrapper[4956]: I0314 09:40:38.519530 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-b4t9h" Mar 14 09:40:38 crc kubenswrapper[4956]: I0314 09:40:38.531346 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-test-account-create-update-7wqsv" Mar 14 09:40:38 crc kubenswrapper[4956]: I0314 09:40:38.618906 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbwt8\" (UniqueName: \"kubernetes.io/projected/abd728e9-6c5e-4ace-ab76-af2f7eb7e229-kube-api-access-lbwt8\") pod \"abd728e9-6c5e-4ace-ab76-af2f7eb7e229\" (UID: \"abd728e9-6c5e-4ace-ab76-af2f7eb7e229\") " Mar 14 09:40:38 crc kubenswrapper[4956]: I0314 09:40:38.619315 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abd728e9-6c5e-4ace-ab76-af2f7eb7e229-operator-scripts\") pod \"abd728e9-6c5e-4ace-ab76-af2f7eb7e229\" (UID: \"abd728e9-6c5e-4ace-ab76-af2f7eb7e229\") " Mar 14 09:40:38 crc kubenswrapper[4956]: I0314 09:40:38.619343 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed6584f5-8c92-4efb-946f-77d205b1758a-operator-scripts\") pod \"ed6584f5-8c92-4efb-946f-77d205b1758a\" (UID: \"ed6584f5-8c92-4efb-946f-77d205b1758a\") " Mar 14 09:40:38 crc kubenswrapper[4956]: I0314 09:40:38.619373 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrx97\" (UniqueName: \"kubernetes.io/projected/ed6584f5-8c92-4efb-946f-77d205b1758a-kube-api-access-nrx97\") pod \"ed6584f5-8c92-4efb-946f-77d205b1758a\" (UID: \"ed6584f5-8c92-4efb-946f-77d205b1758a\") " Mar 14 09:40:38 crc kubenswrapper[4956]: I0314 09:40:38.620135 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed6584f5-8c92-4efb-946f-77d205b1758a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed6584f5-8c92-4efb-946f-77d205b1758a" (UID: "ed6584f5-8c92-4efb-946f-77d205b1758a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:40:38 crc kubenswrapper[4956]: I0314 09:40:38.620703 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abd728e9-6c5e-4ace-ab76-af2f7eb7e229-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "abd728e9-6c5e-4ace-ab76-af2f7eb7e229" (UID: "abd728e9-6c5e-4ace-ab76-af2f7eb7e229"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:40:38 crc kubenswrapper[4956]: I0314 09:40:38.624614 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abd728e9-6c5e-4ace-ab76-af2f7eb7e229-kube-api-access-lbwt8" (OuterVolumeSpecName: "kube-api-access-lbwt8") pod "abd728e9-6c5e-4ace-ab76-af2f7eb7e229" (UID: "abd728e9-6c5e-4ace-ab76-af2f7eb7e229"). InnerVolumeSpecName "kube-api-access-lbwt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:40:38 crc kubenswrapper[4956]: I0314 09:40:38.624750 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed6584f5-8c92-4efb-946f-77d205b1758a-kube-api-access-nrx97" (OuterVolumeSpecName: "kube-api-access-nrx97") pod "ed6584f5-8c92-4efb-946f-77d205b1758a" (UID: "ed6584f5-8c92-4efb-946f-77d205b1758a"). InnerVolumeSpecName "kube-api-access-nrx97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:40:38 crc kubenswrapper[4956]: I0314 09:40:38.721460 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbwt8\" (UniqueName: \"kubernetes.io/projected/abd728e9-6c5e-4ace-ab76-af2f7eb7e229-kube-api-access-lbwt8\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:38 crc kubenswrapper[4956]: I0314 09:40:38.721500 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abd728e9-6c5e-4ace-ab76-af2f7eb7e229-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:38 crc kubenswrapper[4956]: I0314 09:40:38.721510 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed6584f5-8c92-4efb-946f-77d205b1758a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:38 crc kubenswrapper[4956]: I0314 09:40:38.721523 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrx97\" (UniqueName: \"kubernetes.io/projected/ed6584f5-8c92-4efb-946f-77d205b1758a-kube-api-access-nrx97\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:39 crc kubenswrapper[4956]: I0314 09:40:39.005660 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-test-account-create-update-7wqsv" Mar 14 09:40:39 crc kubenswrapper[4956]: I0314 09:40:39.005831 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-test-account-create-update-7wqsv" event={"ID":"ed6584f5-8c92-4efb-946f-77d205b1758a","Type":"ContainerDied","Data":"ee96ae218441e4aa7e39a53f6c81e0d5a106f7fc218dd09a745912d96991bd08"} Mar 14 09:40:39 crc kubenswrapper[4956]: I0314 09:40:39.005883 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee96ae218441e4aa7e39a53f6c81e0d5a106f7fc218dd09a745912d96991bd08" Mar 14 09:40:39 crc kubenswrapper[4956]: I0314 09:40:39.008405 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-b4t9h" Mar 14 09:40:39 crc kubenswrapper[4956]: I0314 09:40:39.008405 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-b4t9h" event={"ID":"abd728e9-6c5e-4ace-ab76-af2f7eb7e229","Type":"ContainerDied","Data":"1830c4705a1d0abe2c5869b2c3d85b5bd18ef5d90575912f1b9bb007ff812181"} Mar 14 09:40:39 crc kubenswrapper[4956]: I0314 09:40:39.008516 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1830c4705a1d0abe2c5869b2c3d85b5bd18ef5d90575912f1b9bb007ff812181" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.055235 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-vglwm"] Mar 14 09:40:40 crc kubenswrapper[4956]: E0314 09:40:40.055990 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6584f5-8c92-4efb-946f-77d205b1758a" containerName="mariadb-account-create-update" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.056011 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6584f5-8c92-4efb-946f-77d205b1758a" containerName="mariadb-account-create-update" Mar 14 09:40:40 crc 
kubenswrapper[4956]: E0314 09:40:40.056026 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd728e9-6c5e-4ace-ab76-af2f7eb7e229" containerName="mariadb-database-create" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.056034 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd728e9-6c5e-4ace-ab76-af2f7eb7e229" containerName="mariadb-database-create" Mar 14 09:40:40 crc kubenswrapper[4956]: E0314 09:40:40.056063 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0100c485-2f3c-4cbf-ac49-566c008facb5" containerName="watcher-decision-engine" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.056074 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0100c485-2f3c-4cbf-ac49-566c008facb5" containerName="watcher-decision-engine" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.056246 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="0100c485-2f3c-4cbf-ac49-566c008facb5" containerName="watcher-decision-engine" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.056277 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed6584f5-8c92-4efb-946f-77d205b1758a" containerName="mariadb-account-create-update" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.056288 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="abd728e9-6c5e-4ace-ab76-af2f7eb7e229" containerName="mariadb-database-create" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.056978 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.059169 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.060060 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-sb2gk" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.075106 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-vglwm"] Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.143244 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-config-data\") pod \"watcher-kuttl-db-sync-vglwm\" (UID: \"ffc1286b-2939-4f02-8adf-3b92bff6e904\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.143328 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-vglwm\" (UID: \"ffc1286b-2939-4f02-8adf-3b92bff6e904\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.143366 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-db-sync-config-data\") pod \"watcher-kuttl-db-sync-vglwm\" (UID: \"ffc1286b-2939-4f02-8adf-3b92bff6e904\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.143387 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c554\" (UniqueName: \"kubernetes.io/projected/ffc1286b-2939-4f02-8adf-3b92bff6e904-kube-api-access-9c554\") pod \"watcher-kuttl-db-sync-vglwm\" (UID: \"ffc1286b-2939-4f02-8adf-3b92bff6e904\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.245473 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-vglwm\" (UID: \"ffc1286b-2939-4f02-8adf-3b92bff6e904\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.245559 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-db-sync-config-data\") pod \"watcher-kuttl-db-sync-vglwm\" (UID: \"ffc1286b-2939-4f02-8adf-3b92bff6e904\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.245591 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c554\" (UniqueName: \"kubernetes.io/projected/ffc1286b-2939-4f02-8adf-3b92bff6e904-kube-api-access-9c554\") pod \"watcher-kuttl-db-sync-vglwm\" (UID: \"ffc1286b-2939-4f02-8adf-3b92bff6e904\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.245706 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-config-data\") pod \"watcher-kuttl-db-sync-vglwm\" (UID: \"ffc1286b-2939-4f02-8adf-3b92bff6e904\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 
09:40:40.251116 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-db-sync-config-data\") pod \"watcher-kuttl-db-sync-vglwm\" (UID: \"ffc1286b-2939-4f02-8adf-3b92bff6e904\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.252196 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-vglwm\" (UID: \"ffc1286b-2939-4f02-8adf-3b92bff6e904\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.257182 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-config-data\") pod \"watcher-kuttl-db-sync-vglwm\" (UID: \"ffc1286b-2939-4f02-8adf-3b92bff6e904\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.262547 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c554\" (UniqueName: \"kubernetes.io/projected/ffc1286b-2939-4f02-8adf-3b92bff6e904-kube-api-access-9c554\") pod \"watcher-kuttl-db-sync-vglwm\" (UID: \"ffc1286b-2939-4f02-8adf-3b92bff6e904\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.378284 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" Mar 14 09:40:40 crc kubenswrapper[4956]: W0314 09:40:40.859589 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffc1286b_2939_4f02_8adf_3b92bff6e904.slice/crio-9695d19921ce584375f9f4d42b800275362b3620564e014fde912fa3a37cb44e WatchSource:0}: Error finding container 9695d19921ce584375f9f4d42b800275362b3620564e014fde912fa3a37cb44e: Status 404 returned error can't find the container with id 9695d19921ce584375f9f4d42b800275362b3620564e014fde912fa3a37cb44e Mar 14 09:40:40 crc kubenswrapper[4956]: I0314 09:40:40.863414 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-vglwm"] Mar 14 09:40:41 crc kubenswrapper[4956]: I0314 09:40:41.026844 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" event={"ID":"ffc1286b-2939-4f02-8adf-3b92bff6e904","Type":"ContainerStarted","Data":"9695d19921ce584375f9f4d42b800275362b3620564e014fde912fa3a37cb44e"} Mar 14 09:40:42 crc kubenswrapper[4956]: I0314 09:40:42.038068 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" event={"ID":"ffc1286b-2939-4f02-8adf-3b92bff6e904","Type":"ContainerStarted","Data":"eab81e6bc17e803ff5dad9fae895f899217da9b8d2374071891cc93dd312a247"} Mar 14 09:40:42 crc kubenswrapper[4956]: I0314 09:40:42.060530 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" podStartSLOduration=2.060505329 podStartE2EDuration="2.060505329s" podCreationTimestamp="2026-03-14 09:40:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:40:42.05579805 +0000 UTC m=+2647.568490328" watchObservedRunningTime="2026-03-14 09:40:42.060505329 
+0000 UTC m=+2647.573197597" Mar 14 09:40:44 crc kubenswrapper[4956]: I0314 09:40:44.056893 4956 generic.go:334] "Generic (PLEG): container finished" podID="ffc1286b-2939-4f02-8adf-3b92bff6e904" containerID="eab81e6bc17e803ff5dad9fae895f899217da9b8d2374071891cc93dd312a247" exitCode=0 Mar 14 09:40:44 crc kubenswrapper[4956]: I0314 09:40:44.057007 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" event={"ID":"ffc1286b-2939-4f02-8adf-3b92bff6e904","Type":"ContainerDied","Data":"eab81e6bc17e803ff5dad9fae895f899217da9b8d2374071891cc93dd312a247"} Mar 14 09:40:45 crc kubenswrapper[4956]: I0314 09:40:45.431124 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" Mar 14 09:40:45 crc kubenswrapper[4956]: I0314 09:40:45.551086 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c554\" (UniqueName: \"kubernetes.io/projected/ffc1286b-2939-4f02-8adf-3b92bff6e904-kube-api-access-9c554\") pod \"ffc1286b-2939-4f02-8adf-3b92bff6e904\" (UID: \"ffc1286b-2939-4f02-8adf-3b92bff6e904\") " Mar 14 09:40:45 crc kubenswrapper[4956]: I0314 09:40:45.551151 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-combined-ca-bundle\") pod \"ffc1286b-2939-4f02-8adf-3b92bff6e904\" (UID: \"ffc1286b-2939-4f02-8adf-3b92bff6e904\") " Mar 14 09:40:45 crc kubenswrapper[4956]: I0314 09:40:45.551326 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-config-data\") pod \"ffc1286b-2939-4f02-8adf-3b92bff6e904\" (UID: \"ffc1286b-2939-4f02-8adf-3b92bff6e904\") " Mar 14 09:40:45 crc kubenswrapper[4956]: I0314 09:40:45.551419 4956 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-db-sync-config-data\") pod \"ffc1286b-2939-4f02-8adf-3b92bff6e904\" (UID: \"ffc1286b-2939-4f02-8adf-3b92bff6e904\") " Mar 14 09:40:45 crc kubenswrapper[4956]: I0314 09:40:45.556717 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffc1286b-2939-4f02-8adf-3b92bff6e904-kube-api-access-9c554" (OuterVolumeSpecName: "kube-api-access-9c554") pod "ffc1286b-2939-4f02-8adf-3b92bff6e904" (UID: "ffc1286b-2939-4f02-8adf-3b92bff6e904"). InnerVolumeSpecName "kube-api-access-9c554". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:40:45 crc kubenswrapper[4956]: I0314 09:40:45.566601 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ffc1286b-2939-4f02-8adf-3b92bff6e904" (UID: "ffc1286b-2939-4f02-8adf-3b92bff6e904"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:45 crc kubenswrapper[4956]: I0314 09:40:45.588204 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffc1286b-2939-4f02-8adf-3b92bff6e904" (UID: "ffc1286b-2939-4f02-8adf-3b92bff6e904"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:45 crc kubenswrapper[4956]: I0314 09:40:45.590870 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-config-data" (OuterVolumeSpecName: "config-data") pod "ffc1286b-2939-4f02-8adf-3b92bff6e904" (UID: "ffc1286b-2939-4f02-8adf-3b92bff6e904"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:40:45 crc kubenswrapper[4956]: I0314 09:40:45.653742 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:45 crc kubenswrapper[4956]: I0314 09:40:45.653997 4956 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:45 crc kubenswrapper[4956]: I0314 09:40:45.654065 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c554\" (UniqueName: \"kubernetes.io/projected/ffc1286b-2939-4f02-8adf-3b92bff6e904-kube-api-access-9c554\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:45 crc kubenswrapper[4956]: I0314 09:40:45.654130 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc1286b-2939-4f02-8adf-3b92bff6e904-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.076899 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" event={"ID":"ffc1286b-2939-4f02-8adf-3b92bff6e904","Type":"ContainerDied","Data":"9695d19921ce584375f9f4d42b800275362b3620564e014fde912fa3a37cb44e"} Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.076942 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9695d19921ce584375f9f4d42b800275362b3620564e014fde912fa3a37cb44e" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.076956 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vglwm" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.367612 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:40:46 crc kubenswrapper[4956]: E0314 09:40:46.370447 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc1286b-2939-4f02-8adf-3b92bff6e904" containerName="watcher-kuttl-db-sync" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.370546 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc1286b-2939-4f02-8adf-3b92bff6e904" containerName="watcher-kuttl-db-sync" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.372724 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffc1286b-2939-4f02-8adf-3b92bff6e904" containerName="watcher-kuttl-db-sync" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.375956 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.379343 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-sb2gk" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.380229 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.410620 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.412532 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.425595 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.437559 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.447189 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.467541 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.468470 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12cdd14a-3970-4b25-b970-66dab7d0050b-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.468511 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.468531 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2846\" (UniqueName: \"kubernetes.io/projected/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-kube-api-access-k2846\") pod \"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 
crc kubenswrapper[4956]: I0314 09:40:46.468654 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.468744 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-logs\") pod \"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.468779 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.468818 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.468856 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.468895 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4v74\" (UniqueName: \"kubernetes.io/projected/12cdd14a-3970-4b25-b970-66dab7d0050b-kube-api-access-l4v74\") pod \"watcher-kuttl-applier-0\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.468969 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.469014 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.503404 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.513430 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.549320 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.550532 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.556854 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.570618 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-logs\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.570675 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.570721 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12cdd14a-3970-4b25-b970-66dab7d0050b-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.570740 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.570757 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.570776 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2846\" (UniqueName: \"kubernetes.io/projected/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-kube-api-access-k2846\") pod \"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.570797 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.570818 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.570835 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-logs\") pod \"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.570854 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: 
\"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.570907 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.570932 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.570969 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.571002 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4v74\" (UniqueName: \"kubernetes.io/projected/12cdd14a-3970-4b25-b970-66dab7d0050b-kube-api-access-l4v74\") pod \"watcher-kuttl-applier-0\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.571028 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw87j\" 
(UniqueName: \"kubernetes.io/projected/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-kube-api-access-bw87j\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.571057 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.571085 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.572496 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-logs\") pod \"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.576338 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.576590 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-config-data\") pod 
\"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.576967 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12cdd14a-3970-4b25-b970-66dab7d0050b-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.578091 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.581936 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.586161 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.586199 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.586987 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.594045 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.594814 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2846\" (UniqueName: \"kubernetes.io/projected/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-kube-api-access-k2846\") pod \"watcher-kuttl-api-0\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.597538 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4v74\" (UniqueName: \"kubernetes.io/projected/12cdd14a-3970-4b25-b970-66dab7d0050b-kube-api-access-l4v74\") pod \"watcher-kuttl-applier-0\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.672397 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.672761 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.672787 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.672813 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.672843 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.672874 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.672901 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bw87j\" (UniqueName: \"kubernetes.io/projected/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-kube-api-access-bw87j\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.672930 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.672978 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-logs\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.673007 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.673024 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.673039 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkchr\" (UniqueName: \"kubernetes.io/projected/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-kube-api-access-hkchr\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.674011 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-logs\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.676605 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.677201 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.677572 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.679652 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.690686 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw87j\" (UniqueName: \"kubernetes.io/projected/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-kube-api-access-bw87j\") pod \"watcher-kuttl-api-1\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:46 crc kubenswrapper[4956]: I0314 09:40:46.705791 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.187053 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.189841 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.190077 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.190222 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.190514 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.190550 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkchr\" (UniqueName: \"kubernetes.io/projected/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-kube-api-access-hkchr\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.190686 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-logs\") pod 
\"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.190740 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.191663 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.196273 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.199954 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.198008 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-custom-prometheus-ca\") pod 
\"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.205114 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.221736 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkchr\" (UniqueName: \"kubernetes.io/projected/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-kube-api-access-hkchr\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.253160 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.644144 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.785891 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:40:47 crc kubenswrapper[4956]: W0314 09:40:47.789832 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12cdd14a_3970_4b25_b970_66dab7d0050b.slice/crio-a980d5d5a1bf532ba86abb3a29ea24a8f6eeee598bbc854c2817d4ed1c4141c1 WatchSource:0}: Error finding container a980d5d5a1bf532ba86abb3a29ea24a8f6eeee598bbc854c2817d4ed1c4141c1: Status 404 returned error can't find the container with id a980d5d5a1bf532ba86abb3a29ea24a8f6eeee598bbc854c2817d4ed1c4141c1 Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.795319 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Mar 14 09:40:47 crc kubenswrapper[4956]: I0314 09:40:47.915261 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:40:47 crc kubenswrapper[4956]: W0314 09:40:47.921011 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc90bac0_c6b8_43f8_8c5d_75a2431c7f5f.slice/crio-f06c2be1698da07bbf533225d275abca225f0a8134d7f0f1d912595cad47e5bd WatchSource:0}: Error finding container f06c2be1698da07bbf533225d275abca225f0a8134d7f0f1d912595cad47e5bd: Status 404 returned error can't find the container with id f06c2be1698da07bbf533225d275abca225f0a8134d7f0f1d912595cad47e5bd Mar 14 09:40:48 crc kubenswrapper[4956]: I0314 09:40:48.222932 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d","Type":"ContainerStarted","Data":"712ce829858f8b73eb7e7d2e4ccb7a05574106d1a188aed2d82823e03f3d01a3"} Mar 14 09:40:48 crc kubenswrapper[4956]: I0314 09:40:48.222991 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d","Type":"ContainerStarted","Data":"01f7a768ab602e771e3087583227f87a6430617ad7ce663608d0ae48cc166d74"} Mar 14 09:40:48 crc kubenswrapper[4956]: I0314 09:40:48.231028 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"12cdd14a-3970-4b25-b970-66dab7d0050b","Type":"ContainerStarted","Data":"d9c877aee3d5833f57fe3736e31a429a1c470815a098ac1084bacece56a7238a"} Mar 14 09:40:48 crc kubenswrapper[4956]: I0314 09:40:48.231084 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"12cdd14a-3970-4b25-b970-66dab7d0050b","Type":"ContainerStarted","Data":"a980d5d5a1bf532ba86abb3a29ea24a8f6eeee598bbc854c2817d4ed1c4141c1"} Mar 14 09:40:48 crc kubenswrapper[4956]: I0314 09:40:48.240627 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f","Type":"ContainerStarted","Data":"f06c2be1698da07bbf533225d275abca225f0a8134d7f0f1d912595cad47e5bd"} Mar 14 09:40:48 crc kubenswrapper[4956]: I0314 09:40:48.246432 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"376e2618-0b14-4c8f-86f8-f4b705e2dcaf","Type":"ContainerStarted","Data":"4c236f34c474734fa5d3cfc606fe8cb0ea990ce7e24e39c2f72db76f07a24516"} Mar 14 09:40:48 crc kubenswrapper[4956]: I0314 09:40:48.246492 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"376e2618-0b14-4c8f-86f8-f4b705e2dcaf","Type":"ContainerStarted","Data":"f37e2ec5ec20154b834387e075acb6924e1952606a9b1b2ef7e2ba45b8716969"} Mar 14 09:40:48 crc kubenswrapper[4956]: I0314 09:40:48.277346 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.277319304 podStartE2EDuration="2.277319304s" podCreationTimestamp="2026-03-14 09:40:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:40:48.26806206 +0000 UTC m=+2653.780754348" watchObservedRunningTime="2026-03-14 09:40:48.277319304 +0000 UTC m=+2653.790011572" Mar 14 09:40:49 crc kubenswrapper[4956]: I0314 09:40:49.259282 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d","Type":"ContainerStarted","Data":"c236450a0216af061e8189fb3d31150718d9c88f0b76afbaed54a0e6dcefac8f"} Mar 14 09:40:49 crc kubenswrapper[4956]: I0314 09:40:49.259734 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:49 crc kubenswrapper[4956]: I0314 09:40:49.273434 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f","Type":"ContainerStarted","Data":"b95f63a4971077944b7e3b9be9930c41da3443cf00979f9a470190305d8c06c6"} Mar 14 09:40:49 crc kubenswrapper[4956]: I0314 09:40:49.278812 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-1" podStartSLOduration=3.278796952 podStartE2EDuration="3.278796952s" podCreationTimestamp="2026-03-14 09:40:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 
09:40:49.277992612 +0000 UTC m=+2654.790684880" watchObservedRunningTime="2026-03-14 09:40:49.278796952 +0000 UTC m=+2654.791489220" Mar 14 09:40:49 crc kubenswrapper[4956]: I0314 09:40:49.280978 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"376e2618-0b14-4c8f-86f8-f4b705e2dcaf","Type":"ContainerStarted","Data":"0328604b6a7bfae527e32c22087e73d586c884791c3993d2dc7585b26a065ca8"} Mar 14 09:40:49 crc kubenswrapper[4956]: I0314 09:40:49.281375 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:49 crc kubenswrapper[4956]: I0314 09:40:49.304018 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=3.30399413 podStartE2EDuration="3.30399413s" podCreationTimestamp="2026-03-14 09:40:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:40:49.297992428 +0000 UTC m=+2654.810684706" watchObservedRunningTime="2026-03-14 09:40:49.30399413 +0000 UTC m=+2654.816686398" Mar 14 09:40:49 crc kubenswrapper[4956]: I0314 09:40:49.335153 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=3.335128948 podStartE2EDuration="3.335128948s" podCreationTimestamp="2026-03-14 09:40:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:40:49.325736651 +0000 UTC m=+2654.838428919" watchObservedRunningTime="2026-03-14 09:40:49.335128948 +0000 UTC m=+2654.847821216" Mar 14 09:40:51 crc kubenswrapper[4956]: I0314 09:40:51.497600 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:51 crc 
kubenswrapper[4956]: I0314 09:40:51.674546 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:51 crc kubenswrapper[4956]: I0314 09:40:51.707488 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:52 crc kubenswrapper[4956]: I0314 09:40:52.190400 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:52 crc kubenswrapper[4956]: I0314 09:40:52.190450 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:56 crc kubenswrapper[4956]: I0314 09:40:56.421148 4956 scope.go:117] "RemoveContainer" containerID="853facb268109455932b741d6f99159af45f2ed18a4d785e29d67356b80c9788" Mar 14 09:40:56 crc kubenswrapper[4956]: I0314 09:40:56.707226 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:56 crc kubenswrapper[4956]: I0314 09:40:56.714417 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:57 crc kubenswrapper[4956]: I0314 09:40:57.190376 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:57 crc kubenswrapper[4956]: I0314 09:40:57.190446 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:57 crc kubenswrapper[4956]: I0314 09:40:57.194621 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:57 crc kubenswrapper[4956]: I0314 09:40:57.219762 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:57 crc kubenswrapper[4956]: I0314 09:40:57.254375 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:57 crc kubenswrapper[4956]: I0314 09:40:57.278453 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:57 crc kubenswrapper[4956]: I0314 09:40:57.346329 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:57 crc kubenswrapper[4956]: I0314 09:40:57.350842 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:40:57 crc kubenswrapper[4956]: I0314 09:40:57.353293 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:40:57 crc kubenswrapper[4956]: I0314 09:40:57.374073 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:40:57 crc kubenswrapper[4956]: I0314 09:40:57.388189 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:40:59 crc kubenswrapper[4956]: I0314 09:40:59.577096 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:40:59 crc kubenswrapper[4956]: I0314 09:40:59.577735 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerName="sg-core" containerID="cri-o://c4f13f8c640076d9b7f8102b4035aa5f3fb114a169255aefde1cfccd154abb56" gracePeriod=30 Mar 14 09:40:59 crc kubenswrapper[4956]: I0314 09:40:59.577783 4956 
kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerName="proxy-httpd" containerID="cri-o://0eebd3fefef86f16e9db8478b98c92cce482f8010ee4f57144c7b1eec920d745" gracePeriod=30 Mar 14 09:40:59 crc kubenswrapper[4956]: I0314 09:40:59.577843 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerName="ceilometer-notification-agent" containerID="cri-o://33b63dea98d1d23d4a7cf0e3d3f8a503b6d4b74f857b126b38febdfe40cf8bfc" gracePeriod=30 Mar 14 09:40:59 crc kubenswrapper[4956]: I0314 09:40:59.577993 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerName="ceilometer-central-agent" containerID="cri-o://eb67e334b6c617fb9de0c6acaa252073ec9a494a44f59a20b4fe18eb912be13e" gracePeriod=30 Mar 14 09:40:59 crc kubenswrapper[4956]: I0314 09:40:59.592758 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.224:3000/\": EOF" Mar 14 09:40:59 crc kubenswrapper[4956]: E0314 09:40:59.811722 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f144f71_ba2b_4b74_8a4d_8bfa683d65f9.slice/crio-conmon-0eebd3fefef86f16e9db8478b98c92cce482f8010ee4f57144c7b1eec920d745.scope\": RecentStats: unable to find data in memory cache]" Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.133920 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5"] Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 
09:41:00.138852 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.141066 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.141379 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-scripts" Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.147459 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5"] Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.240437 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-config-data\") pod \"watcher-kuttl-db-purge-29558021-nnhl5\" (UID: \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.240615 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-combined-ca-bundle\") pod \"watcher-kuttl-db-purge-29558021-nnhl5\" (UID: \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.240643 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-scripts-volume\") pod \"watcher-kuttl-db-purge-29558021-nnhl5\" (UID: \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\") " 
pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.240676 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql5jw\" (UniqueName: \"kubernetes.io/projected/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-kube-api-access-ql5jw\") pod \"watcher-kuttl-db-purge-29558021-nnhl5\" (UID: \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.341644 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-scripts-volume\") pod \"watcher-kuttl-db-purge-29558021-nnhl5\" (UID: \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.341724 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql5jw\" (UniqueName: \"kubernetes.io/projected/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-kube-api-access-ql5jw\") pod \"watcher-kuttl-db-purge-29558021-nnhl5\" (UID: \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.341837 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-config-data\") pod \"watcher-kuttl-db-purge-29558021-nnhl5\" (UID: \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.341927 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-combined-ca-bundle\") pod \"watcher-kuttl-db-purge-29558021-nnhl5\" (UID: \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.347443 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-scripts-volume\") pod \"watcher-kuttl-db-purge-29558021-nnhl5\" (UID: \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.347811 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-combined-ca-bundle\") pod \"watcher-kuttl-db-purge-29558021-nnhl5\" (UID: \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.348543 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-config-data\") pod \"watcher-kuttl-db-purge-29558021-nnhl5\" (UID: \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.360070 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql5jw\" (UniqueName: \"kubernetes.io/projected/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-kube-api-access-ql5jw\") pod \"watcher-kuttl-db-purge-29558021-nnhl5\" (UID: \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.401621 4956 generic.go:334] "Generic (PLEG): 
container finished" podID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerID="0eebd3fefef86f16e9db8478b98c92cce482f8010ee4f57144c7b1eec920d745" exitCode=0 Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.401660 4956 generic.go:334] "Generic (PLEG): container finished" podID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerID="c4f13f8c640076d9b7f8102b4035aa5f3fb114a169255aefde1cfccd154abb56" exitCode=2 Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.401671 4956 generic.go:334] "Generic (PLEG): container finished" podID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerID="eb67e334b6c617fb9de0c6acaa252073ec9a494a44f59a20b4fe18eb912be13e" exitCode=0 Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.401694 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9","Type":"ContainerDied","Data":"0eebd3fefef86f16e9db8478b98c92cce482f8010ee4f57144c7b1eec920d745"} Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.401722 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9","Type":"ContainerDied","Data":"c4f13f8c640076d9b7f8102b4035aa5f3fb114a169255aefde1cfccd154abb56"} Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.401731 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9","Type":"ContainerDied","Data":"eb67e334b6c617fb9de0c6acaa252073ec9a494a44f59a20b4fe18eb912be13e"} Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.459026 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" Mar 14 09:41:00 crc kubenswrapper[4956]: I0314 09:41:00.890781 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5"] Mar 14 09:41:00 crc kubenswrapper[4956]: W0314 09:41:00.898867 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d4b395e_a9a4_4bf8_b8e9_335c51bd6433.slice/crio-15b4b44b90e834e6295351135b2acf3660a943e84aeddd0cd9a3c3863cdffd03 WatchSource:0}: Error finding container 15b4b44b90e834e6295351135b2acf3660a943e84aeddd0cd9a3c3863cdffd03: Status 404 returned error can't find the container with id 15b4b44b90e834e6295351135b2acf3660a943e84aeddd0cd9a3c3863cdffd03 Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.399824 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.414796 4956 generic.go:334] "Generic (PLEG): container finished" podID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerID="33b63dea98d1d23d4a7cf0e3d3f8a503b6d4b74f857b126b38febdfe40cf8bfc" exitCode=0 Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.414840 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9","Type":"ContainerDied","Data":"33b63dea98d1d23d4a7cf0e3d3f8a503b6d4b74f857b126b38febdfe40cf8bfc"} Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.414906 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9","Type":"ContainerDied","Data":"eaa788bb559025dfeb6aa44a86ff5ab834101234644169bb59c82de7f8b84d81"} Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.414935 4956 scope.go:117] "RemoveContainer" 
containerID="0eebd3fefef86f16e9db8478b98c92cce482f8010ee4f57144c7b1eec920d745" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.415221 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.417011 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" event={"ID":"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433","Type":"ContainerStarted","Data":"a5d699beb83b3a31c0f680edc711efcb80d5075d8032ae393c1cfaf597c0ed52"} Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.417054 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" event={"ID":"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433","Type":"ContainerStarted","Data":"15b4b44b90e834e6295351135b2acf3660a943e84aeddd0cd9a3c3863cdffd03"} Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.436941 4956 scope.go:117] "RemoveContainer" containerID="c4f13f8c640076d9b7f8102b4035aa5f3fb114a169255aefde1cfccd154abb56" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.460355 4956 scope.go:117] "RemoveContainer" containerID="33b63dea98d1d23d4a7cf0e3d3f8a503b6d4b74f857b126b38febdfe40cf8bfc" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.480450 4956 scope.go:117] "RemoveContainer" containerID="eb67e334b6c617fb9de0c6acaa252073ec9a494a44f59a20b4fe18eb912be13e" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.499651 4956 scope.go:117] "RemoveContainer" containerID="0eebd3fefef86f16e9db8478b98c92cce482f8010ee4f57144c7b1eec920d745" Mar 14 09:41:01 crc kubenswrapper[4956]: E0314 09:41:01.500086 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eebd3fefef86f16e9db8478b98c92cce482f8010ee4f57144c7b1eec920d745\": container with ID starting with 
0eebd3fefef86f16e9db8478b98c92cce482f8010ee4f57144c7b1eec920d745 not found: ID does not exist" containerID="0eebd3fefef86f16e9db8478b98c92cce482f8010ee4f57144c7b1eec920d745" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.500143 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eebd3fefef86f16e9db8478b98c92cce482f8010ee4f57144c7b1eec920d745"} err="failed to get container status \"0eebd3fefef86f16e9db8478b98c92cce482f8010ee4f57144c7b1eec920d745\": rpc error: code = NotFound desc = could not find container \"0eebd3fefef86f16e9db8478b98c92cce482f8010ee4f57144c7b1eec920d745\": container with ID starting with 0eebd3fefef86f16e9db8478b98c92cce482f8010ee4f57144c7b1eec920d745 not found: ID does not exist" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.500170 4956 scope.go:117] "RemoveContainer" containerID="c4f13f8c640076d9b7f8102b4035aa5f3fb114a169255aefde1cfccd154abb56" Mar 14 09:41:01 crc kubenswrapper[4956]: E0314 09:41:01.500409 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4f13f8c640076d9b7f8102b4035aa5f3fb114a169255aefde1cfccd154abb56\": container with ID starting with c4f13f8c640076d9b7f8102b4035aa5f3fb114a169255aefde1cfccd154abb56 not found: ID does not exist" containerID="c4f13f8c640076d9b7f8102b4035aa5f3fb114a169255aefde1cfccd154abb56" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.500437 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4f13f8c640076d9b7f8102b4035aa5f3fb114a169255aefde1cfccd154abb56"} err="failed to get container status \"c4f13f8c640076d9b7f8102b4035aa5f3fb114a169255aefde1cfccd154abb56\": rpc error: code = NotFound desc = could not find container \"c4f13f8c640076d9b7f8102b4035aa5f3fb114a169255aefde1cfccd154abb56\": container with ID starting with c4f13f8c640076d9b7f8102b4035aa5f3fb114a169255aefde1cfccd154abb56 not found: ID does not 
exist" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.500472 4956 scope.go:117] "RemoveContainer" containerID="33b63dea98d1d23d4a7cf0e3d3f8a503b6d4b74f857b126b38febdfe40cf8bfc" Mar 14 09:41:01 crc kubenswrapper[4956]: E0314 09:41:01.501005 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b63dea98d1d23d4a7cf0e3d3f8a503b6d4b74f857b126b38febdfe40cf8bfc\": container with ID starting with 33b63dea98d1d23d4a7cf0e3d3f8a503b6d4b74f857b126b38febdfe40cf8bfc not found: ID does not exist" containerID="33b63dea98d1d23d4a7cf0e3d3f8a503b6d4b74f857b126b38febdfe40cf8bfc" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.501051 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b63dea98d1d23d4a7cf0e3d3f8a503b6d4b74f857b126b38febdfe40cf8bfc"} err="failed to get container status \"33b63dea98d1d23d4a7cf0e3d3f8a503b6d4b74f857b126b38febdfe40cf8bfc\": rpc error: code = NotFound desc = could not find container \"33b63dea98d1d23d4a7cf0e3d3f8a503b6d4b74f857b126b38febdfe40cf8bfc\": container with ID starting with 33b63dea98d1d23d4a7cf0e3d3f8a503b6d4b74f857b126b38febdfe40cf8bfc not found: ID does not exist" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.501072 4956 scope.go:117] "RemoveContainer" containerID="eb67e334b6c617fb9de0c6acaa252073ec9a494a44f59a20b4fe18eb912be13e" Mar 14 09:41:01 crc kubenswrapper[4956]: E0314 09:41:01.501973 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb67e334b6c617fb9de0c6acaa252073ec9a494a44f59a20b4fe18eb912be13e\": container with ID starting with eb67e334b6c617fb9de0c6acaa252073ec9a494a44f59a20b4fe18eb912be13e not found: ID does not exist" containerID="eb67e334b6c617fb9de0c6acaa252073ec9a494a44f59a20b4fe18eb912be13e" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.502007 4956 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb67e334b6c617fb9de0c6acaa252073ec9a494a44f59a20b4fe18eb912be13e"} err="failed to get container status \"eb67e334b6c617fb9de0c6acaa252073ec9a494a44f59a20b4fe18eb912be13e\": rpc error: code = NotFound desc = could not find container \"eb67e334b6c617fb9de0c6acaa252073ec9a494a44f59a20b4fe18eb912be13e\": container with ID starting with eb67e334b6c617fb9de0c6acaa252073ec9a494a44f59a20b4fe18eb912be13e not found: ID does not exist" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.561825 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-combined-ca-bundle\") pod \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.561999 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-scripts\") pod \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.562041 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdp7h\" (UniqueName: \"kubernetes.io/projected/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-kube-api-access-sdp7h\") pod \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.562157 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-log-httpd\") pod \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.562232 4956 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-sg-core-conf-yaml\") pod \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.562308 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-run-httpd\") pod \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.562368 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-config-data\") pod \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.562402 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-ceilometer-tls-certs\") pod \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\" (UID: \"8f144f71-ba2b-4b74-8a4d-8bfa683d65f9\") " Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.562688 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" (UID: "8f144f71-ba2b-4b74-8a4d-8bfa683d65f9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.562807 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" (UID: "8f144f71-ba2b-4b74-8a4d-8bfa683d65f9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.563275 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.563307 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.567439 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-scripts" (OuterVolumeSpecName: "scripts") pod "8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" (UID: "8f144f71-ba2b-4b74-8a4d-8bfa683d65f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.567686 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-kube-api-access-sdp7h" (OuterVolumeSpecName: "kube-api-access-sdp7h") pod "8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" (UID: "8f144f71-ba2b-4b74-8a4d-8bfa683d65f9"). InnerVolumeSpecName "kube-api-access-sdp7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.595395 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" (UID: "8f144f71-ba2b-4b74-8a4d-8bfa683d65f9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.610610 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" (UID: "8f144f71-ba2b-4b74-8a4d-8bfa683d65f9"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.645201 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" (UID: "8f144f71-ba2b-4b74-8a4d-8bfa683d65f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.660849 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-config-data" (OuterVolumeSpecName: "config-data") pod "8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" (UID: "8f144f71-ba2b-4b74-8a4d-8bfa683d65f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.665977 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.666176 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.666390 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdp7h\" (UniqueName: \"kubernetes.io/projected/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-kube-api-access-sdp7h\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.667303 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.667376 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.667434 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.774594 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" podStartSLOduration=1.7745666610000002 podStartE2EDuration="1.774566661s" podCreationTimestamp="2026-03-14 09:41:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:41:01.448853517 +0000 UTC m=+2666.961545785" watchObservedRunningTime="2026-03-14 09:41:01.774566661 +0000 UTC m=+2667.287258929" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.776790 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.784855 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.798733 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:41:01 crc kubenswrapper[4956]: E0314 09:41:01.799134 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerName="ceilometer-central-agent" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.799158 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerName="ceilometer-central-agent" Mar 14 09:41:01 crc kubenswrapper[4956]: E0314 09:41:01.799174 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerName="ceilometer-notification-agent" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.799182 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerName="ceilometer-notification-agent" Mar 14 09:41:01 crc kubenswrapper[4956]: E0314 09:41:01.799209 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerName="sg-core" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.799220 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerName="sg-core" Mar 14 09:41:01 crc 
kubenswrapper[4956]: E0314 09:41:01.799235 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerName="proxy-httpd" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.799243 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerName="proxy-httpd" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.799434 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerName="sg-core" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.799455 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerName="ceilometer-notification-agent" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.799476 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerName="ceilometer-central-agent" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.799488 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" containerName="proxy-httpd" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.801306 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.803656 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.803930 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.804932 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.813413 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.973294 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-config-data\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.973346 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.973372 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eaff9ac-67ca-495c-a4d8-03ce876856ed-run-httpd\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.973391 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mjgk\" (UniqueName: \"kubernetes.io/projected/6eaff9ac-67ca-495c-a4d8-03ce876856ed-kube-api-access-2mjgk\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.973435 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eaff9ac-67ca-495c-a4d8-03ce876856ed-log-httpd\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.973458 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.973825 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-scripts\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:01 crc kubenswrapper[4956]: I0314 09:41:01.973953 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.075375 4956 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-config-data\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.075477 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.075561 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eaff9ac-67ca-495c-a4d8-03ce876856ed-run-httpd\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.075605 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mjgk\" (UniqueName: \"kubernetes.io/projected/6eaff9ac-67ca-495c-a4d8-03ce876856ed-kube-api-access-2mjgk\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.075688 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eaff9ac-67ca-495c-a4d8-03ce876856ed-log-httpd\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.075756 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.075870 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-scripts\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.075943 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.076079 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eaff9ac-67ca-495c-a4d8-03ce876856ed-log-httpd\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.076108 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eaff9ac-67ca-495c-a4d8-03ce876856ed-run-httpd\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.079386 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-scripts\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.080341 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.081310 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.085356 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.090878 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-config-data\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.093445 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mjgk\" (UniqueName: \"kubernetes.io/projected/6eaff9ac-67ca-495c-a4d8-03ce876856ed-kube-api-access-2mjgk\") pod \"ceilometer-0\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.121600 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:02 crc kubenswrapper[4956]: W0314 09:41:02.606615 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6eaff9ac_67ca_495c_a4d8_03ce876856ed.slice/crio-90fa49883a8a57daed7a9237cabb41543b4cee5da3e11c3a6b479156fc54d3dc WatchSource:0}: Error finding container 90fa49883a8a57daed7a9237cabb41543b4cee5da3e11c3a6b479156fc54d3dc: Status 404 returned error can't find the container with id 90fa49883a8a57daed7a9237cabb41543b4cee5da3e11c3a6b479156fc54d3dc Mar 14 09:41:02 crc kubenswrapper[4956]: I0314 09:41:02.608730 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:41:03 crc kubenswrapper[4956]: I0314 09:41:03.220965 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f144f71-ba2b-4b74-8a4d-8bfa683d65f9" path="/var/lib/kubelet/pods/8f144f71-ba2b-4b74-8a4d-8bfa683d65f9/volumes" Mar 14 09:41:03 crc kubenswrapper[4956]: I0314 09:41:03.442992 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6eaff9ac-67ca-495c-a4d8-03ce876856ed","Type":"ContainerStarted","Data":"90fa49883a8a57daed7a9237cabb41543b4cee5da3e11c3a6b479156fc54d3dc"} Mar 14 09:41:04 crc kubenswrapper[4956]: I0314 09:41:04.454020 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6eaff9ac-67ca-495c-a4d8-03ce876856ed","Type":"ContainerStarted","Data":"e6133a8f86c7e6bfadc4e86fb5e38195b2acdd44f7374385b7b241d160bf020a"} Mar 14 09:41:04 crc kubenswrapper[4956]: I0314 09:41:04.454112 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6eaff9ac-67ca-495c-a4d8-03ce876856ed","Type":"ContainerStarted","Data":"a327ac938c07e99326f06e798c75f0ac754072e20eefb9ce30f69c037fd1ec29"} Mar 14 09:41:04 crc kubenswrapper[4956]: I0314 
09:41:04.455424 4956 generic.go:334] "Generic (PLEG): container finished" podID="4d4b395e-a9a4-4bf8-b8e9-335c51bd6433" containerID="a5d699beb83b3a31c0f680edc711efcb80d5075d8032ae393c1cfaf597c0ed52" exitCode=0 Mar 14 09:41:04 crc kubenswrapper[4956]: I0314 09:41:04.455464 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" event={"ID":"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433","Type":"ContainerDied","Data":"a5d699beb83b3a31c0f680edc711efcb80d5075d8032ae393c1cfaf597c0ed52"} Mar 14 09:41:05 crc kubenswrapper[4956]: I0314 09:41:05.466971 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6eaff9ac-67ca-495c-a4d8-03ce876856ed","Type":"ContainerStarted","Data":"ef04048ec1be961d4b15f3b566a0b202a2049bc5d1bdc94e17adc2f51feaa59f"} Mar 14 09:41:05 crc kubenswrapper[4956]: I0314 09:41:05.852023 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" Mar 14 09:41:05 crc kubenswrapper[4956]: I0314 09:41:05.955352 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-scripts-volume\") pod \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\" (UID: \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\") " Mar 14 09:41:05 crc kubenswrapper[4956]: I0314 09:41:05.955461 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-config-data\") pod \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\" (UID: \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\") " Mar 14 09:41:05 crc kubenswrapper[4956]: I0314 09:41:05.955544 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql5jw\" (UniqueName: 
\"kubernetes.io/projected/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-kube-api-access-ql5jw\") pod \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\" (UID: \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\") " Mar 14 09:41:05 crc kubenswrapper[4956]: I0314 09:41:05.955899 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-combined-ca-bundle\") pod \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\" (UID: \"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433\") " Mar 14 09:41:05 crc kubenswrapper[4956]: I0314 09:41:05.960402 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-scripts-volume" (OuterVolumeSpecName: "scripts-volume") pod "4d4b395e-a9a4-4bf8-b8e9-335c51bd6433" (UID: "4d4b395e-a9a4-4bf8-b8e9-335c51bd6433"). InnerVolumeSpecName "scripts-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:05 crc kubenswrapper[4956]: I0314 09:41:05.962938 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-kube-api-access-ql5jw" (OuterVolumeSpecName: "kube-api-access-ql5jw") pod "4d4b395e-a9a4-4bf8-b8e9-335c51bd6433" (UID: "4d4b395e-a9a4-4bf8-b8e9-335c51bd6433"). InnerVolumeSpecName "kube-api-access-ql5jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:41:06 crc kubenswrapper[4956]: I0314 09:41:05.989588 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d4b395e-a9a4-4bf8-b8e9-335c51bd6433" (UID: "4d4b395e-a9a4-4bf8-b8e9-335c51bd6433"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:06 crc kubenswrapper[4956]: I0314 09:41:06.002689 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-config-data" (OuterVolumeSpecName: "config-data") pod "4d4b395e-a9a4-4bf8-b8e9-335c51bd6433" (UID: "4d4b395e-a9a4-4bf8-b8e9-335c51bd6433"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:06 crc kubenswrapper[4956]: I0314 09:41:06.058014 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:06 crc kubenswrapper[4956]: I0314 09:41:06.058052 4956 reconciler_common.go:293] "Volume detached for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-scripts-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:06 crc kubenswrapper[4956]: I0314 09:41:06.058061 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:06 crc kubenswrapper[4956]: I0314 09:41:06.058071 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql5jw\" (UniqueName: \"kubernetes.io/projected/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433-kube-api-access-ql5jw\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:06 crc kubenswrapper[4956]: I0314 09:41:06.477254 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" event={"ID":"4d4b395e-a9a4-4bf8-b8e9-335c51bd6433","Type":"ContainerDied","Data":"15b4b44b90e834e6295351135b2acf3660a943e84aeddd0cd9a3c3863cdffd03"} Mar 14 09:41:06 crc kubenswrapper[4956]: I0314 09:41:06.477293 4956 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="15b4b44b90e834e6295351135b2acf3660a943e84aeddd0cd9a3c3863cdffd03" Mar 14 09:41:06 crc kubenswrapper[4956]: I0314 09:41:06.477298 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5" Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.312740 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-vglwm"] Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.320312 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-vglwm"] Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.342476 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5"] Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.350724 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-lmfq8"] Mar 14 09:41:08 crc kubenswrapper[4956]: E0314 09:41:08.352222 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d4b395e-a9a4-4bf8-b8e9-335c51bd6433" containerName="watcher-db-manage" Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.352255 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4b395e-a9a4-4bf8-b8e9-335c51bd6433" containerName="watcher-db-manage" Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.352462 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d4b395e-a9a4-4bf8-b8e9-335c51bd6433" containerName="watcher-db-manage" Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.353059 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watchertest-account-delete-lmfq8" Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.361090 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29558021-nnhl5"] Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.368577 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-lmfq8"] Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.394926 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.395159 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="12cdd14a-3970-4b25-b970-66dab7d0050b" containerName="watcher-applier" containerID="cri-o://d9c877aee3d5833f57fe3736e31a429a1c470815a098ac1084bacece56a7238a" gracePeriod=30 Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.457239 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.457586 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f" containerName="watcher-decision-engine" containerID="cri-o://b95f63a4971077944b7e3b9be9930c41da3443cf00979f9a470190305d8c06c6" gracePeriod=30 Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.501020 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6eaff9ac-67ca-495c-a4d8-03ce876856ed","Type":"ContainerStarted","Data":"411a4e24048879fc52596588dcb5ed23ac71f98e8d04c96eaaebb326d023b18b"} Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.501181 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c6a5698-e006-4a0f-94ad-6b32044d1fbe-operator-scripts\") pod \"watchertest-account-delete-lmfq8\" (UID: \"3c6a5698-e006-4a0f-94ad-6b32044d1fbe\") " pod="watcher-kuttl-default/watchertest-account-delete-lmfq8" Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.501195 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.502357 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk4tr\" (UniqueName: \"kubernetes.io/projected/3c6a5698-e006-4a0f-94ad-6b32044d1fbe-kube-api-access-gk4tr\") pod \"watchertest-account-delete-lmfq8\" (UID: \"3c6a5698-e006-4a0f-94ad-6b32044d1fbe\") " pod="watcher-kuttl-default/watchertest-account-delete-lmfq8" Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.600134 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.600628 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="376e2618-0b14-4c8f-86f8-f4b705e2dcaf" containerName="watcher-kuttl-api-log" containerID="cri-o://4c236f34c474734fa5d3cfc606fe8cb0ea990ce7e24e39c2f72db76f07a24516" gracePeriod=30 Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.601110 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="376e2618-0b14-4c8f-86f8-f4b705e2dcaf" containerName="watcher-api" containerID="cri-o://0328604b6a7bfae527e32c22087e73d586c884791c3993d2dc7585b26a065ca8" gracePeriod=30 Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.605120 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gk4tr\" (UniqueName: \"kubernetes.io/projected/3c6a5698-e006-4a0f-94ad-6b32044d1fbe-kube-api-access-gk4tr\") pod \"watchertest-account-delete-lmfq8\" (UID: \"3c6a5698-e006-4a0f-94ad-6b32044d1fbe\") " pod="watcher-kuttl-default/watchertest-account-delete-lmfq8" Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.606430 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c6a5698-e006-4a0f-94ad-6b32044d1fbe-operator-scripts\") pod \"watchertest-account-delete-lmfq8\" (UID: \"3c6a5698-e006-4a0f-94ad-6b32044d1fbe\") " pod="watcher-kuttl-default/watchertest-account-delete-lmfq8" Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.608006 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c6a5698-e006-4a0f-94ad-6b32044d1fbe-operator-scripts\") pod \"watchertest-account-delete-lmfq8\" (UID: \"3c6a5698-e006-4a0f-94ad-6b32044d1fbe\") " pod="watcher-kuttl-default/watchertest-account-delete-lmfq8" Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.620566 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.620819 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d" containerName="watcher-kuttl-api-log" containerID="cri-o://712ce829858f8b73eb7e7d2e4ccb7a05574106d1a188aed2d82823e03f3d01a3" gracePeriod=30 Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.621264 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d" containerName="watcher-api" containerID="cri-o://c236450a0216af061e8189fb3d31150718d9c88f0b76afbaed54a0e6dcefac8f" gracePeriod=30 
Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.624689 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=3.127929096 podStartE2EDuration="7.624664933s" podCreationTimestamp="2026-03-14 09:41:01 +0000 UTC" firstStartedPulling="2026-03-14 09:41:02.610583901 +0000 UTC m=+2668.123276169" lastFinishedPulling="2026-03-14 09:41:07.107319738 +0000 UTC m=+2672.620012006" observedRunningTime="2026-03-14 09:41:08.541016056 +0000 UTC m=+2674.053708324" watchObservedRunningTime="2026-03-14 09:41:08.624664933 +0000 UTC m=+2674.137357201" Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.633407 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk4tr\" (UniqueName: \"kubernetes.io/projected/3c6a5698-e006-4a0f-94ad-6b32044d1fbe-kube-api-access-gk4tr\") pod \"watchertest-account-delete-lmfq8\" (UID: \"3c6a5698-e006-4a0f-94ad-6b32044d1fbe\") " pod="watcher-kuttl-default/watchertest-account-delete-lmfq8" Mar 14 09:41:08 crc kubenswrapper[4956]: I0314 09:41:08.669136 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watchertest-account-delete-lmfq8" Mar 14 09:41:09 crc kubenswrapper[4956]: I0314 09:41:09.230892 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d4b395e-a9a4-4bf8-b8e9-335c51bd6433" path="/var/lib/kubelet/pods/4d4b395e-a9a4-4bf8-b8e9-335c51bd6433/volumes" Mar 14 09:41:09 crc kubenswrapper[4956]: I0314 09:41:09.232416 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffc1286b-2939-4f02-8adf-3b92bff6e904" path="/var/lib/kubelet/pods/ffc1286b-2939-4f02-8adf-3b92bff6e904/volumes" Mar 14 09:41:09 crc kubenswrapper[4956]: I0314 09:41:09.363794 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-lmfq8"] Mar 14 09:41:09 crc kubenswrapper[4956]: I0314 09:41:09.540002 4956 generic.go:334] "Generic (PLEG): container finished" podID="39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d" containerID="712ce829858f8b73eb7e7d2e4ccb7a05574106d1a188aed2d82823e03f3d01a3" exitCode=143 Mar 14 09:41:09 crc kubenswrapper[4956]: I0314 09:41:09.540056 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d","Type":"ContainerDied","Data":"712ce829858f8b73eb7e7d2e4ccb7a05574106d1a188aed2d82823e03f3d01a3"} Mar 14 09:41:09 crc kubenswrapper[4956]: I0314 09:41:09.542622 4956 generic.go:334] "Generic (PLEG): container finished" podID="12cdd14a-3970-4b25-b970-66dab7d0050b" containerID="d9c877aee3d5833f57fe3736e31a429a1c470815a098ac1084bacece56a7238a" exitCode=0 Mar 14 09:41:09 crc kubenswrapper[4956]: I0314 09:41:09.542669 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"12cdd14a-3970-4b25-b970-66dab7d0050b","Type":"ContainerDied","Data":"d9c877aee3d5833f57fe3736e31a429a1c470815a098ac1084bacece56a7238a"} Mar 14 09:41:09 crc kubenswrapper[4956]: I0314 09:41:09.544222 4956 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchertest-account-delete-lmfq8" event={"ID":"3c6a5698-e006-4a0f-94ad-6b32044d1fbe","Type":"ContainerStarted","Data":"422631ed04eceb6f6f3a7fb16ea35a83ddaa0635951317d1d28b08ad8625f3d0"} Mar 14 09:41:09 crc kubenswrapper[4956]: I0314 09:41:09.549325 4956 generic.go:334] "Generic (PLEG): container finished" podID="376e2618-0b14-4c8f-86f8-f4b705e2dcaf" containerID="4c236f34c474734fa5d3cfc606fe8cb0ea990ce7e24e39c2f72db76f07a24516" exitCode=143 Mar 14 09:41:09 crc kubenswrapper[4956]: I0314 09:41:09.549422 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"376e2618-0b14-4c8f-86f8-f4b705e2dcaf","Type":"ContainerDied","Data":"4c236f34c474734fa5d3cfc606fe8cb0ea990ce7e24e39c2f72db76f07a24516"} Mar 14 09:41:09 crc kubenswrapper[4956]: I0314 09:41:09.968174 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.141272 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12cdd14a-3970-4b25-b970-66dab7d0050b-logs\") pod \"12cdd14a-3970-4b25-b970-66dab7d0050b\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.141470 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-combined-ca-bundle\") pod \"12cdd14a-3970-4b25-b970-66dab7d0050b\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.141521 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4v74\" (UniqueName: \"kubernetes.io/projected/12cdd14a-3970-4b25-b970-66dab7d0050b-kube-api-access-l4v74\") pod 
\"12cdd14a-3970-4b25-b970-66dab7d0050b\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.141603 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-cert-memcached-mtls\") pod \"12cdd14a-3970-4b25-b970-66dab7d0050b\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.141664 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-config-data\") pod \"12cdd14a-3970-4b25-b970-66dab7d0050b\" (UID: \"12cdd14a-3970-4b25-b970-66dab7d0050b\") " Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.145976 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12cdd14a-3970-4b25-b970-66dab7d0050b-logs" (OuterVolumeSpecName: "logs") pod "12cdd14a-3970-4b25-b970-66dab7d0050b" (UID: "12cdd14a-3970-4b25-b970-66dab7d0050b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.168363 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12cdd14a-3970-4b25-b970-66dab7d0050b-kube-api-access-l4v74" (OuterVolumeSpecName: "kube-api-access-l4v74") pod "12cdd14a-3970-4b25-b970-66dab7d0050b" (UID: "12cdd14a-3970-4b25-b970-66dab7d0050b"). InnerVolumeSpecName "kube-api-access-l4v74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.214654 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-config-data" (OuterVolumeSpecName: "config-data") pod "12cdd14a-3970-4b25-b970-66dab7d0050b" (UID: "12cdd14a-3970-4b25-b970-66dab7d0050b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.243547 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.243952 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12cdd14a-3970-4b25-b970-66dab7d0050b-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.243967 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4v74\" (UniqueName: \"kubernetes.io/projected/12cdd14a-3970-4b25-b970-66dab7d0050b-kube-api-access-l4v74\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.270036 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.277646 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12cdd14a-3970-4b25-b970-66dab7d0050b" (UID: "12cdd14a-3970-4b25-b970-66dab7d0050b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.306791 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "12cdd14a-3970-4b25-b970-66dab7d0050b" (UID: "12cdd14a-3970-4b25-b970-66dab7d0050b"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.345614 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.345657 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/12cdd14a-3970-4b25-b970-66dab7d0050b-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.453132 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-combined-ca-bundle\") pod \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.453354 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-logs\") pod \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.453389 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-config-data\") pod 
\"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.453438 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-cert-memcached-mtls\") pod \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.453472 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-custom-prometheus-ca\") pod \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.453535 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2846\" (UniqueName: \"kubernetes.io/projected/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-kube-api-access-k2846\") pod \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\" (UID: \"376e2618-0b14-4c8f-86f8-f4b705e2dcaf\") " Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.461922 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-logs" (OuterVolumeSpecName: "logs") pod "376e2618-0b14-4c8f-86f8-f4b705e2dcaf" (UID: "376e2618-0b14-4c8f-86f8-f4b705e2dcaf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.475836 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-kube-api-access-k2846" (OuterVolumeSpecName: "kube-api-access-k2846") pod "376e2618-0b14-4c8f-86f8-f4b705e2dcaf" (UID: "376e2618-0b14-4c8f-86f8-f4b705e2dcaf"). 
InnerVolumeSpecName "kube-api-access-k2846". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.524050 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "376e2618-0b14-4c8f-86f8-f4b705e2dcaf" (UID: "376e2618-0b14-4c8f-86f8-f4b705e2dcaf"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.528143 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-config-data" (OuterVolumeSpecName: "config-data") pod "376e2618-0b14-4c8f-86f8-f4b705e2dcaf" (UID: "376e2618-0b14-4c8f-86f8-f4b705e2dcaf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.528660 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "376e2618-0b14-4c8f-86f8-f4b705e2dcaf" (UID: "376e2618-0b14-4c8f-86f8-f4b705e2dcaf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.555548 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.555574 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2846\" (UniqueName: \"kubernetes.io/projected/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-kube-api-access-k2846\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.555585 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.555596 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.555605 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.570344 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"12cdd14a-3970-4b25-b970-66dab7d0050b","Type":"ContainerDied","Data":"a980d5d5a1bf532ba86abb3a29ea24a8f6eeee598bbc854c2817d4ed1c4141c1"} Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.570412 4956 scope.go:117] "RemoveContainer" containerID="d9c877aee3d5833f57fe3736e31a429a1c470815a098ac1084bacece56a7238a" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.570597 4956 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.580066 4956 generic.go:334] "Generic (PLEG): container finished" podID="3c6a5698-e006-4a0f-94ad-6b32044d1fbe" containerID="5c670ffe272006dd521ff52eb0d057dc2460b566a0c6187610c6e00b4e6010bb" exitCode=0 Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.580350 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchertest-account-delete-lmfq8" event={"ID":"3c6a5698-e006-4a0f-94ad-6b32044d1fbe","Type":"ContainerDied","Data":"5c670ffe272006dd521ff52eb0d057dc2460b566a0c6187610c6e00b4e6010bb"} Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.584509 4956 generic.go:334] "Generic (PLEG): container finished" podID="376e2618-0b14-4c8f-86f8-f4b705e2dcaf" containerID="0328604b6a7bfae527e32c22087e73d586c884791c3993d2dc7585b26a065ca8" exitCode=0 Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.584688 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"376e2618-0b14-4c8f-86f8-f4b705e2dcaf","Type":"ContainerDied","Data":"0328604b6a7bfae527e32c22087e73d586c884791c3993d2dc7585b26a065ca8"} Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.584781 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"376e2618-0b14-4c8f-86f8-f4b705e2dcaf","Type":"ContainerDied","Data":"f37e2ec5ec20154b834387e075acb6924e1952606a9b1b2ef7e2ba45b8716969"} Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.584930 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.590110 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "376e2618-0b14-4c8f-86f8-f4b705e2dcaf" (UID: "376e2618-0b14-4c8f-86f8-f4b705e2dcaf"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.629304 4956 scope.go:117] "RemoveContainer" containerID="0328604b6a7bfae527e32c22087e73d586c884791c3993d2dc7585b26a065ca8" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.636661 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.646983 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.657742 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/376e2618-0b14-4c8f-86f8-f4b705e2dcaf-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:10 crc kubenswrapper[4956]: I0314 09:41:10.777413 4956 scope.go:117] "RemoveContainer" containerID="4c236f34c474734fa5d3cfc606fe8cb0ea990ce7e24e39c2f72db76f07a24516" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:10.799540 4956 scope.go:117] "RemoveContainer" containerID="0328604b6a7bfae527e32c22087e73d586c884791c3993d2dc7585b26a065ca8" Mar 14 09:41:11 crc kubenswrapper[4956]: E0314 09:41:10.800031 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0328604b6a7bfae527e32c22087e73d586c884791c3993d2dc7585b26a065ca8\": container with ID starting with 
0328604b6a7bfae527e32c22087e73d586c884791c3993d2dc7585b26a065ca8 not found: ID does not exist" containerID="0328604b6a7bfae527e32c22087e73d586c884791c3993d2dc7585b26a065ca8" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:10.800058 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0328604b6a7bfae527e32c22087e73d586c884791c3993d2dc7585b26a065ca8"} err="failed to get container status \"0328604b6a7bfae527e32c22087e73d586c884791c3993d2dc7585b26a065ca8\": rpc error: code = NotFound desc = could not find container \"0328604b6a7bfae527e32c22087e73d586c884791c3993d2dc7585b26a065ca8\": container with ID starting with 0328604b6a7bfae527e32c22087e73d586c884791c3993d2dc7585b26a065ca8 not found: ID does not exist" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:10.800078 4956 scope.go:117] "RemoveContainer" containerID="4c236f34c474734fa5d3cfc606fe8cb0ea990ce7e24e39c2f72db76f07a24516" Mar 14 09:41:11 crc kubenswrapper[4956]: E0314 09:41:10.800318 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c236f34c474734fa5d3cfc606fe8cb0ea990ce7e24e39c2f72db76f07a24516\": container with ID starting with 4c236f34c474734fa5d3cfc606fe8cb0ea990ce7e24e39c2f72db76f07a24516 not found: ID does not exist" containerID="4c236f34c474734fa5d3cfc606fe8cb0ea990ce7e24e39c2f72db76f07a24516" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:10.800334 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c236f34c474734fa5d3cfc606fe8cb0ea990ce7e24e39c2f72db76f07a24516"} err="failed to get container status \"4c236f34c474734fa5d3cfc606fe8cb0ea990ce7e24e39c2f72db76f07a24516\": rpc error: code = NotFound desc = could not find container \"4c236f34c474734fa5d3cfc606fe8cb0ea990ce7e24e39c2f72db76f07a24516\": container with ID starting with 4c236f34c474734fa5d3cfc606fe8cb0ea990ce7e24e39c2f72db76f07a24516 not found: ID does not 
exist" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:10.945162 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:10.954721 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.073415 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.166975 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-cert-memcached-mtls\") pod \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.167730 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw87j\" (UniqueName: \"kubernetes.io/projected/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-kube-api-access-bw87j\") pod \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.167869 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-combined-ca-bundle\") pod \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.168330 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-custom-prometheus-ca\") pod \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " Mar 14 
09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.168376 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-logs\") pod \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.168463 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-config-data\") pod \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\" (UID: \"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d\") " Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.169431 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-logs" (OuterVolumeSpecName: "logs") pod "39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d" (UID: "39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.193642 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-kube-api-access-bw87j" (OuterVolumeSpecName: "kube-api-access-bw87j") pod "39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d" (UID: "39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d"). InnerVolumeSpecName "kube-api-access-bw87j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.200627 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d" (UID: "39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.215682 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-config-data" (OuterVolumeSpecName: "config-data") pod "39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d" (UID: "39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.220727 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d" (UID: "39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.231909 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12cdd14a-3970-4b25-b970-66dab7d0050b" path="/var/lib/kubelet/pods/12cdd14a-3970-4b25-b970-66dab7d0050b/volumes" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.233099 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="376e2618-0b14-4c8f-86f8-f4b705e2dcaf" path="/var/lib/kubelet/pods/376e2618-0b14-4c8f-86f8-f4b705e2dcaf/volumes" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.257969 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d" (UID: "39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.270739 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.270779 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.270794 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.270805 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.270819 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw87j\" (UniqueName: \"kubernetes.io/projected/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-kube-api-access-bw87j\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.270833 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.596907 4956 generic.go:334] "Generic (PLEG): container finished" podID="39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d" containerID="c236450a0216af061e8189fb3d31150718d9c88f0b76afbaed54a0e6dcefac8f" exitCode=0 Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.596999 4956 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.597094 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d","Type":"ContainerDied","Data":"c236450a0216af061e8189fb3d31150718d9c88f0b76afbaed54a0e6dcefac8f"} Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.597197 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d","Type":"ContainerDied","Data":"01f7a768ab602e771e3087583227f87a6430617ad7ce663608d0ae48cc166d74"} Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.597226 4956 scope.go:117] "RemoveContainer" containerID="c236450a0216af061e8189fb3d31150718d9c88f0b76afbaed54a0e6dcefac8f" Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.640737 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Mar 14 09:41:11 crc kubenswrapper[4956]: I0314 09:41:11.653006 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Mar 14 09:41:12 crc kubenswrapper[4956]: I0314 09:41:12.002861 4956 scope.go:117] "RemoveContainer" containerID="712ce829858f8b73eb7e7d2e4ccb7a05574106d1a188aed2d82823e03f3d01a3" Mar 14 09:41:12 crc kubenswrapper[4956]: I0314 09:41:12.028694 4956 scope.go:117] "RemoveContainer" containerID="c236450a0216af061e8189fb3d31150718d9c88f0b76afbaed54a0e6dcefac8f" Mar 14 09:41:12 crc kubenswrapper[4956]: E0314 09:41:12.029261 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c236450a0216af061e8189fb3d31150718d9c88f0b76afbaed54a0e6dcefac8f\": container with ID starting with c236450a0216af061e8189fb3d31150718d9c88f0b76afbaed54a0e6dcefac8f not found: ID 
does not exist" containerID="c236450a0216af061e8189fb3d31150718d9c88f0b76afbaed54a0e6dcefac8f" Mar 14 09:41:12 crc kubenswrapper[4956]: I0314 09:41:12.029310 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c236450a0216af061e8189fb3d31150718d9c88f0b76afbaed54a0e6dcefac8f"} err="failed to get container status \"c236450a0216af061e8189fb3d31150718d9c88f0b76afbaed54a0e6dcefac8f\": rpc error: code = NotFound desc = could not find container \"c236450a0216af061e8189fb3d31150718d9c88f0b76afbaed54a0e6dcefac8f\": container with ID starting with c236450a0216af061e8189fb3d31150718d9c88f0b76afbaed54a0e6dcefac8f not found: ID does not exist" Mar 14 09:41:12 crc kubenswrapper[4956]: I0314 09:41:12.029339 4956 scope.go:117] "RemoveContainer" containerID="712ce829858f8b73eb7e7d2e4ccb7a05574106d1a188aed2d82823e03f3d01a3" Mar 14 09:41:12 crc kubenswrapper[4956]: E0314 09:41:12.030058 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"712ce829858f8b73eb7e7d2e4ccb7a05574106d1a188aed2d82823e03f3d01a3\": container with ID starting with 712ce829858f8b73eb7e7d2e4ccb7a05574106d1a188aed2d82823e03f3d01a3 not found: ID does not exist" containerID="712ce829858f8b73eb7e7d2e4ccb7a05574106d1a188aed2d82823e03f3d01a3" Mar 14 09:41:12 crc kubenswrapper[4956]: I0314 09:41:12.030094 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"712ce829858f8b73eb7e7d2e4ccb7a05574106d1a188aed2d82823e03f3d01a3"} err="failed to get container status \"712ce829858f8b73eb7e7d2e4ccb7a05574106d1a188aed2d82823e03f3d01a3\": rpc error: code = NotFound desc = could not find container \"712ce829858f8b73eb7e7d2e4ccb7a05574106d1a188aed2d82823e03f3d01a3\": container with ID starting with 712ce829858f8b73eb7e7d2e4ccb7a05574106d1a188aed2d82823e03f3d01a3 not found: ID does not exist" Mar 14 09:41:12 crc kubenswrapper[4956]: I0314 09:41:12.501286 4956 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:41:12 crc kubenswrapper[4956]: I0314 09:41:12.501608 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerName="ceilometer-central-agent" containerID="cri-o://a327ac938c07e99326f06e798c75f0ac754072e20eefb9ce30f69c037fd1ec29" gracePeriod=30 Mar 14 09:41:12 crc kubenswrapper[4956]: I0314 09:41:12.502071 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerName="proxy-httpd" containerID="cri-o://411a4e24048879fc52596588dcb5ed23ac71f98e8d04c96eaaebb326d023b18b" gracePeriod=30 Mar 14 09:41:12 crc kubenswrapper[4956]: I0314 09:41:12.502144 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerName="sg-core" containerID="cri-o://ef04048ec1be961d4b15f3b566a0b202a2049bc5d1bdc94e17adc2f51feaa59f" gracePeriod=30 Mar 14 09:41:12 crc kubenswrapper[4956]: I0314 09:41:12.502185 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerName="ceilometer-notification-agent" containerID="cri-o://e6133a8f86c7e6bfadc4e86fb5e38195b2acdd44f7374385b7b241d160bf020a" gracePeriod=30 Mar 14 09:41:12 crc kubenswrapper[4956]: I0314 09:41:12.878214 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watchertest-account-delete-lmfq8" Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.008673 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk4tr\" (UniqueName: \"kubernetes.io/projected/3c6a5698-e006-4a0f-94ad-6b32044d1fbe-kube-api-access-gk4tr\") pod \"3c6a5698-e006-4a0f-94ad-6b32044d1fbe\" (UID: \"3c6a5698-e006-4a0f-94ad-6b32044d1fbe\") " Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.008884 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c6a5698-e006-4a0f-94ad-6b32044d1fbe-operator-scripts\") pod \"3c6a5698-e006-4a0f-94ad-6b32044d1fbe\" (UID: \"3c6a5698-e006-4a0f-94ad-6b32044d1fbe\") " Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.009558 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c6a5698-e006-4a0f-94ad-6b32044d1fbe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c6a5698-e006-4a0f-94ad-6b32044d1fbe" (UID: "3c6a5698-e006-4a0f-94ad-6b32044d1fbe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.010571 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c6a5698-e006-4a0f-94ad-6b32044d1fbe-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.015597 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c6a5698-e006-4a0f-94ad-6b32044d1fbe-kube-api-access-gk4tr" (OuterVolumeSpecName: "kube-api-access-gk4tr") pod "3c6a5698-e006-4a0f-94ad-6b32044d1fbe" (UID: "3c6a5698-e006-4a0f-94ad-6b32044d1fbe"). InnerVolumeSpecName "kube-api-access-gk4tr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.112470 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk4tr\" (UniqueName: \"kubernetes.io/projected/3c6a5698-e006-4a0f-94ad-6b32044d1fbe-kube-api-access-gk4tr\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.222944 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d" path="/var/lib/kubelet/pods/39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d/volumes" Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.814179 4956 generic.go:334] "Generic (PLEG): container finished" podID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerID="411a4e24048879fc52596588dcb5ed23ac71f98e8d04c96eaaebb326d023b18b" exitCode=0 Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.814714 4956 generic.go:334] "Generic (PLEG): container finished" podID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerID="ef04048ec1be961d4b15f3b566a0b202a2049bc5d1bdc94e17adc2f51feaa59f" exitCode=2 Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.814733 4956 generic.go:334] "Generic (PLEG): container finished" podID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerID="e6133a8f86c7e6bfadc4e86fb5e38195b2acdd44f7374385b7b241d160bf020a" exitCode=0 Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.814745 4956 generic.go:334] "Generic (PLEG): container finished" podID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerID="a327ac938c07e99326f06e798c75f0ac754072e20eefb9ce30f69c037fd1ec29" exitCode=0 Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.814238 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6eaff9ac-67ca-495c-a4d8-03ce876856ed","Type":"ContainerDied","Data":"411a4e24048879fc52596588dcb5ed23ac71f98e8d04c96eaaebb326d023b18b"} Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.814847 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6eaff9ac-67ca-495c-a4d8-03ce876856ed","Type":"ContainerDied","Data":"ef04048ec1be961d4b15f3b566a0b202a2049bc5d1bdc94e17adc2f51feaa59f"} Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.814861 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6eaff9ac-67ca-495c-a4d8-03ce876856ed","Type":"ContainerDied","Data":"e6133a8f86c7e6bfadc4e86fb5e38195b2acdd44f7374385b7b241d160bf020a"} Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.814873 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6eaff9ac-67ca-495c-a4d8-03ce876856ed","Type":"ContainerDied","Data":"a327ac938c07e99326f06e798c75f0ac754072e20eefb9ce30f69c037fd1ec29"} Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.817195 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchertest-account-delete-lmfq8" event={"ID":"3c6a5698-e006-4a0f-94ad-6b32044d1fbe","Type":"ContainerDied","Data":"422631ed04eceb6f6f3a7fb16ea35a83ddaa0635951317d1d28b08ad8625f3d0"} Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.817245 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="422631ed04eceb6f6f3a7fb16ea35a83ddaa0635951317d1d28b08ad8625f3d0" Mar 14 09:41:13 crc kubenswrapper[4956]: I0314 09:41:13.817268 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchertest-account-delete-lmfq8" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.402194 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.536825 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-ceilometer-tls-certs\") pod \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.536881 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-sg-core-conf-yaml\") pod \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.536911 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-combined-ca-bundle\") pod \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.537027 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eaff9ac-67ca-495c-a4d8-03ce876856ed-log-httpd\") pod \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.537124 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mjgk\" (UniqueName: \"kubernetes.io/projected/6eaff9ac-67ca-495c-a4d8-03ce876856ed-kube-api-access-2mjgk\") pod \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.537157 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eaff9ac-67ca-495c-a4d8-03ce876856ed-run-httpd\") pod \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.537198 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-config-data\") pod \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.537270 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-scripts\") pod \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\" (UID: \"6eaff9ac-67ca-495c-a4d8-03ce876856ed\") " Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.537722 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eaff9ac-67ca-495c-a4d8-03ce876856ed-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6eaff9ac-67ca-495c-a4d8-03ce876856ed" (UID: "6eaff9ac-67ca-495c-a4d8-03ce876856ed"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.537749 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eaff9ac-67ca-495c-a4d8-03ce876856ed-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6eaff9ac-67ca-495c-a4d8-03ce876856ed" (UID: "6eaff9ac-67ca-495c-a4d8-03ce876856ed"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.538193 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eaff9ac-67ca-495c-a4d8-03ce876856ed-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.538216 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6eaff9ac-67ca-495c-a4d8-03ce876856ed-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.549707 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-scripts" (OuterVolumeSpecName: "scripts") pod "6eaff9ac-67ca-495c-a4d8-03ce876856ed" (UID: "6eaff9ac-67ca-495c-a4d8-03ce876856ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.556778 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eaff9ac-67ca-495c-a4d8-03ce876856ed-kube-api-access-2mjgk" (OuterVolumeSpecName: "kube-api-access-2mjgk") pod "6eaff9ac-67ca-495c-a4d8-03ce876856ed" (UID: "6eaff9ac-67ca-495c-a4d8-03ce876856ed"). InnerVolumeSpecName "kube-api-access-2mjgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.612724 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6eaff9ac-67ca-495c-a4d8-03ce876856ed" (UID: "6eaff9ac-67ca-495c-a4d8-03ce876856ed"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.641556 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.641596 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mjgk\" (UniqueName: \"kubernetes.io/projected/6eaff9ac-67ca-495c-a4d8-03ce876856ed-kube-api-access-2mjgk\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.641606 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.643618 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6eaff9ac-67ca-495c-a4d8-03ce876856ed" (UID: "6eaff9ac-67ca-495c-a4d8-03ce876856ed"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.694845 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6eaff9ac-67ca-495c-a4d8-03ce876856ed" (UID: "6eaff9ac-67ca-495c-a4d8-03ce876856ed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.716295 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-config-data" (OuterVolumeSpecName: "config-data") pod "6eaff9ac-67ca-495c-a4d8-03ce876856ed" (UID: "6eaff9ac-67ca-495c-a4d8-03ce876856ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.743148 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.743349 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.743407 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eaff9ac-67ca-495c-a4d8-03ce876856ed-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.831608 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6eaff9ac-67ca-495c-a4d8-03ce876856ed","Type":"ContainerDied","Data":"90fa49883a8a57daed7a9237cabb41543b4cee5da3e11c3a6b479156fc54d3dc"} Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.831668 4956 scope.go:117] "RemoveContainer" containerID="411a4e24048879fc52596588dcb5ed23ac71f98e8d04c96eaaebb326d023b18b" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.831686 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.853102 4956 scope.go:117] "RemoveContainer" containerID="ef04048ec1be961d4b15f3b566a0b202a2049bc5d1bdc94e17adc2f51feaa59f" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.867468 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.876825 4956 scope.go:117] "RemoveContainer" containerID="e6133a8f86c7e6bfadc4e86fb5e38195b2acdd44f7374385b7b241d160bf020a" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.878705 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.899030 4956 scope.go:117] "RemoveContainer" containerID="a327ac938c07e99326f06e798c75f0ac754072e20eefb9ce30f69c037fd1ec29" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.914632 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:41:14 crc kubenswrapper[4956]: E0314 09:41:14.915030 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerName="ceilometer-notification-agent" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915048 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerName="ceilometer-notification-agent" Mar 14 09:41:14 crc kubenswrapper[4956]: E0314 09:41:14.915076 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d" containerName="watcher-kuttl-api-log" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915084 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d" containerName="watcher-kuttl-api-log" Mar 14 09:41:14 crc kubenswrapper[4956]: E0314 09:41:14.915095 4956 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12cdd14a-3970-4b25-b970-66dab7d0050b" containerName="watcher-applier" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915101 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="12cdd14a-3970-4b25-b970-66dab7d0050b" containerName="watcher-applier" Mar 14 09:41:14 crc kubenswrapper[4956]: E0314 09:41:14.915112 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="376e2618-0b14-4c8f-86f8-f4b705e2dcaf" containerName="watcher-kuttl-api-log" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915118 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="376e2618-0b14-4c8f-86f8-f4b705e2dcaf" containerName="watcher-kuttl-api-log" Mar 14 09:41:14 crc kubenswrapper[4956]: E0314 09:41:14.915128 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="376e2618-0b14-4c8f-86f8-f4b705e2dcaf" containerName="watcher-api" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915134 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="376e2618-0b14-4c8f-86f8-f4b705e2dcaf" containerName="watcher-api" Mar 14 09:41:14 crc kubenswrapper[4956]: E0314 09:41:14.915149 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c6a5698-e006-4a0f-94ad-6b32044d1fbe" containerName="mariadb-account-delete" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915156 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c6a5698-e006-4a0f-94ad-6b32044d1fbe" containerName="mariadb-account-delete" Mar 14 09:41:14 crc kubenswrapper[4956]: E0314 09:41:14.915166 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerName="ceilometer-central-agent" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915173 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerName="ceilometer-central-agent" Mar 14 09:41:14 crc kubenswrapper[4956]: E0314 
09:41:14.915188 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerName="proxy-httpd" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915194 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerName="proxy-httpd" Mar 14 09:41:14 crc kubenswrapper[4956]: E0314 09:41:14.915207 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d" containerName="watcher-api" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915213 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d" containerName="watcher-api" Mar 14 09:41:14 crc kubenswrapper[4956]: E0314 09:41:14.915219 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerName="sg-core" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915225 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerName="sg-core" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915364 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerName="proxy-httpd" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915379 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerName="ceilometer-notification-agent" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915389 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d" containerName="watcher-kuttl-api-log" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915397 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="12cdd14a-3970-4b25-b970-66dab7d0050b" containerName="watcher-applier" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 
09:41:14.915404 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="39a9e191-bf9f-4ef7-af6e-96c9b8b6ca8d" containerName="watcher-api" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915414 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="376e2618-0b14-4c8f-86f8-f4b705e2dcaf" containerName="watcher-api" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915423 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerName="sg-core" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915431 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="376e2618-0b14-4c8f-86f8-f4b705e2dcaf" containerName="watcher-kuttl-api-log" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915440 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" containerName="ceilometer-central-agent" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.915448 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c6a5698-e006-4a0f-94ad-6b32044d1fbe" containerName="mariadb-account-delete" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.916931 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.920075 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.920683 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.925397 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:41:14 crc kubenswrapper[4956]: I0314 09:41:14.942739 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.050752 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.050815 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.050904 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-config-data\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.050955 
4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.050985 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-scripts\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.051238 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47aa75ef-abe9-4750-a802-39776852fdd3-log-httpd\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.051397 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47aa75ef-abe9-4750-a802-39776852fdd3-run-httpd\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.051455 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9pmd\" (UniqueName: \"kubernetes.io/projected/47aa75ef-abe9-4750-a802-39776852fdd3-kube-api-access-c9pmd\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.153576 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47aa75ef-abe9-4750-a802-39776852fdd3-run-httpd\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.153630 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9pmd\" (UniqueName: \"kubernetes.io/projected/47aa75ef-abe9-4750-a802-39776852fdd3-kube-api-access-c9pmd\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.153664 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.153683 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.153727 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-config-data\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.153743 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.153785 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-scripts\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.153846 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47aa75ef-abe9-4750-a802-39776852fdd3-log-httpd\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.154207 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47aa75ef-abe9-4750-a802-39776852fdd3-run-httpd\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.154240 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47aa75ef-abe9-4750-a802-39776852fdd3-log-httpd\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.158136 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.158176 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.158148 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.158215 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-scripts\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.161696 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-config-data\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.172057 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9pmd\" (UniqueName: \"kubernetes.io/projected/47aa75ef-abe9-4750-a802-39776852fdd3-kube-api-access-c9pmd\") pod \"ceilometer-0\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.219802 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eaff9ac-67ca-495c-a4d8-03ce876856ed" path="/var/lib/kubelet/pods/6eaff9ac-67ca-495c-a4d8-03ce876856ed/volumes" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.236825 4956 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.701273 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.810381 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.845021 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"47aa75ef-abe9-4750-a802-39776852fdd3","Type":"ContainerStarted","Data":"3d33f1b526a7c5e6ae192716944c51b6076b93efcf15041a072b64572c740665"} Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.848361 4956 generic.go:334] "Generic (PLEG): container finished" podID="fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f" containerID="b95f63a4971077944b7e3b9be9930c41da3443cf00979f9a470190305d8c06c6" exitCode=0 Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.848422 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.848435 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f","Type":"ContainerDied","Data":"b95f63a4971077944b7e3b9be9930c41da3443cf00979f9a470190305d8c06c6"} Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.849581 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f","Type":"ContainerDied","Data":"f06c2be1698da07bbf533225d275abca225f0a8134d7f0f1d912595cad47e5bd"} Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.849645 4956 scope.go:117] "RemoveContainer" containerID="b95f63a4971077944b7e3b9be9930c41da3443cf00979f9a470190305d8c06c6" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.867625 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkchr\" (UniqueName: \"kubernetes.io/projected/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-kube-api-access-hkchr\") pod \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.867888 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-custom-prometheus-ca\") pod \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.868693 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-combined-ca-bundle\") pod \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\" (UID: 
\"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.868799 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-logs\") pod \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.868946 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-config-data\") pod \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.869034 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-cert-memcached-mtls\") pod \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\" (UID: \"fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f\") " Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.869971 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-logs" (OuterVolumeSpecName: "logs") pod "fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f" (UID: "fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.872766 4956 scope.go:117] "RemoveContainer" containerID="b95f63a4971077944b7e3b9be9930c41da3443cf00979f9a470190305d8c06c6" Mar 14 09:41:15 crc kubenswrapper[4956]: E0314 09:41:15.874367 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b95f63a4971077944b7e3b9be9930c41da3443cf00979f9a470190305d8c06c6\": container with ID starting with b95f63a4971077944b7e3b9be9930c41da3443cf00979f9a470190305d8c06c6 not found: ID does not exist" containerID="b95f63a4971077944b7e3b9be9930c41da3443cf00979f9a470190305d8c06c6" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.874509 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b95f63a4971077944b7e3b9be9930c41da3443cf00979f9a470190305d8c06c6"} err="failed to get container status \"b95f63a4971077944b7e3b9be9930c41da3443cf00979f9a470190305d8c06c6\": rpc error: code = NotFound desc = could not find container \"b95f63a4971077944b7e3b9be9930c41da3443cf00979f9a470190305d8c06c6\": container with ID starting with b95f63a4971077944b7e3b9be9930c41da3443cf00979f9a470190305d8c06c6 not found: ID does not exist" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.890149 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-kube-api-access-hkchr" (OuterVolumeSpecName: "kube-api-access-hkchr") pod "fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f" (UID: "fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f"). InnerVolumeSpecName "kube-api-access-hkchr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.894710 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f" (UID: "fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.901683 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f" (UID: "fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.914009 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-config-data" (OuterVolumeSpecName: "config-data") pod "fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f" (UID: "fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.941266 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f" (UID: "fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.971555 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.971794 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.971864 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkchr\" (UniqueName: \"kubernetes.io/projected/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-kube-api-access-hkchr\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.971924 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.972041 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:15 crc kubenswrapper[4956]: I0314 09:41:15.972103 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:16 crc kubenswrapper[4956]: I0314 09:41:16.178729 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:41:16 crc kubenswrapper[4956]: I0314 09:41:16.185543 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:41:16 crc kubenswrapper[4956]: I0314 09:41:16.857238 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"47aa75ef-abe9-4750-a802-39776852fdd3","Type":"ContainerStarted","Data":"4feff8fc2de81fc8f29ddec91ca120be98f23491b0e9dafae2f8e43cc6e4f36f"} Mar 14 09:41:17 crc kubenswrapper[4956]: I0314 09:41:17.222927 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f" path="/var/lib/kubelet/pods/fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f/volumes" Mar 14 09:41:17 crc kubenswrapper[4956]: I0314 09:41:17.869290 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"47aa75ef-abe9-4750-a802-39776852fdd3","Type":"ContainerStarted","Data":"40e184c2aa6b8176c759d01407ad825c2edd13f752babbc435bb80452080acc0"} Mar 14 09:41:17 crc kubenswrapper[4956]: I0314 09:41:17.869550 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"47aa75ef-abe9-4750-a802-39776852fdd3","Type":"ContainerStarted","Data":"7f1db5fb5acc964a075a72dbc77e4a7b76dd333ef2bcc636124333ea323cb6ca"} Mar 14 09:41:18 crc kubenswrapper[4956]: I0314 09:41:18.398932 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-b4t9h"] Mar 14 09:41:18 crc kubenswrapper[4956]: I0314 09:41:18.405601 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-b4t9h"] Mar 14 09:41:18 crc kubenswrapper[4956]: I0314 09:41:18.413866 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-lmfq8"] Mar 14 09:41:18 crc kubenswrapper[4956]: I0314 09:41:18.420223 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-update-7wqsv"] Mar 14 09:41:18 crc kubenswrapper[4956]: 
I0314 09:41:18.427148 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-lmfq8"] Mar 14 09:41:18 crc kubenswrapper[4956]: I0314 09:41:18.434028 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-update-7wqsv"] Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.114303 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-j8fn2"] Mar 14 09:41:19 crc kubenswrapper[4956]: E0314 09:41:19.115114 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f" containerName="watcher-decision-engine" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.115131 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f" containerName="watcher-decision-engine" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.115345 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc90bac0-c6b8-43f8-8c5d-75a2431c7f5f" containerName="watcher-decision-engine" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.116091 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-j8fn2" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.124109 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb"] Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.125675 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.129895 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.134776 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-j8fn2"] Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.148476 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb"] Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.226923 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c6a5698-e006-4a0f-94ad-6b32044d1fbe" path="/var/lib/kubelet/pods/3c6a5698-e006-4a0f-94ad-6b32044d1fbe/volumes" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.227464 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abd728e9-6c5e-4ace-ab76-af2f7eb7e229" path="/var/lib/kubelet/pods/abd728e9-6c5e-4ace-ab76-af2f7eb7e229/volumes" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.227935 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed6584f5-8c92-4efb-946f-77d205b1758a" path="/var/lib/kubelet/pods/ed6584f5-8c92-4efb-946f-77d205b1758a/volumes" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.233643 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d16fe9b4-f1f7-416c-a9c7-2996cddc6a29-operator-scripts\") pod \"watcher-2dda-account-create-update-mn8kb\" (UID: \"d16fe9b4-f1f7-416c-a9c7-2996cddc6a29\") " pod="watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.233718 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-t7xs5\" (UniqueName: \"kubernetes.io/projected/d16fe9b4-f1f7-416c-a9c7-2996cddc6a29-kube-api-access-t7xs5\") pod \"watcher-2dda-account-create-update-mn8kb\" (UID: \"d16fe9b4-f1f7-416c-a9c7-2996cddc6a29\") " pod="watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.233768 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrzhl\" (UniqueName: \"kubernetes.io/projected/65b2bab5-aa6d-4da2-a689-8908125bebff-kube-api-access-hrzhl\") pod \"watcher-db-create-j8fn2\" (UID: \"65b2bab5-aa6d-4da2-a689-8908125bebff\") " pod="watcher-kuttl-default/watcher-db-create-j8fn2" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.233797 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65b2bab5-aa6d-4da2-a689-8908125bebff-operator-scripts\") pod \"watcher-db-create-j8fn2\" (UID: \"65b2bab5-aa6d-4da2-a689-8908125bebff\") " pod="watcher-kuttl-default/watcher-db-create-j8fn2" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.335352 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65b2bab5-aa6d-4da2-a689-8908125bebff-operator-scripts\") pod \"watcher-db-create-j8fn2\" (UID: \"65b2bab5-aa6d-4da2-a689-8908125bebff\") " pod="watcher-kuttl-default/watcher-db-create-j8fn2" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.335463 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d16fe9b4-f1f7-416c-a9c7-2996cddc6a29-operator-scripts\") pod \"watcher-2dda-account-create-update-mn8kb\" (UID: \"d16fe9b4-f1f7-416c-a9c7-2996cddc6a29\") " pod="watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb" Mar 14 09:41:19 crc kubenswrapper[4956]: 
I0314 09:41:19.336131 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7xs5\" (UniqueName: \"kubernetes.io/projected/d16fe9b4-f1f7-416c-a9c7-2996cddc6a29-kube-api-access-t7xs5\") pod \"watcher-2dda-account-create-update-mn8kb\" (UID: \"d16fe9b4-f1f7-416c-a9c7-2996cddc6a29\") " pod="watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.336189 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrzhl\" (UniqueName: \"kubernetes.io/projected/65b2bab5-aa6d-4da2-a689-8908125bebff-kube-api-access-hrzhl\") pod \"watcher-db-create-j8fn2\" (UID: \"65b2bab5-aa6d-4da2-a689-8908125bebff\") " pod="watcher-kuttl-default/watcher-db-create-j8fn2" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.336403 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65b2bab5-aa6d-4da2-a689-8908125bebff-operator-scripts\") pod \"watcher-db-create-j8fn2\" (UID: \"65b2bab5-aa6d-4da2-a689-8908125bebff\") " pod="watcher-kuttl-default/watcher-db-create-j8fn2" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.337322 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d16fe9b4-f1f7-416c-a9c7-2996cddc6a29-operator-scripts\") pod \"watcher-2dda-account-create-update-mn8kb\" (UID: \"d16fe9b4-f1f7-416c-a9c7-2996cddc6a29\") " pod="watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.352556 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrzhl\" (UniqueName: \"kubernetes.io/projected/65b2bab5-aa6d-4da2-a689-8908125bebff-kube-api-access-hrzhl\") pod \"watcher-db-create-j8fn2\" (UID: \"65b2bab5-aa6d-4da2-a689-8908125bebff\") " 
pod="watcher-kuttl-default/watcher-db-create-j8fn2" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.359122 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7xs5\" (UniqueName: \"kubernetes.io/projected/d16fe9b4-f1f7-416c-a9c7-2996cddc6a29-kube-api-access-t7xs5\") pod \"watcher-2dda-account-create-update-mn8kb\" (UID: \"d16fe9b4-f1f7-416c-a9c7-2996cddc6a29\") " pod="watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.451033 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-j8fn2" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.464835 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.908886 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"47aa75ef-abe9-4750-a802-39776852fdd3","Type":"ContainerStarted","Data":"77bf796e2308ac253d55178626033ac6197831c5593c126a7f37f0446cdadcd8"} Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.910294 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.915428 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb"] Mar 14 09:41:19 crc kubenswrapper[4956]: W0314 09:41:19.921087 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd16fe9b4_f1f7_416c_a9c7_2996cddc6a29.slice/crio-53435360ce79f1b7cdf922fea7a6bb77619d05db9a947c56c0f2423ce7f01dcb WatchSource:0}: Error finding container 53435360ce79f1b7cdf922fea7a6bb77619d05db9a947c56c0f2423ce7f01dcb: Status 404 returned error 
can't find the container with id 53435360ce79f1b7cdf922fea7a6bb77619d05db9a947c56c0f2423ce7f01dcb Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.945987 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.4451668570000002 podStartE2EDuration="5.945954584s" podCreationTimestamp="2026-03-14 09:41:14 +0000 UTC" firstStartedPulling="2026-03-14 09:41:15.822783574 +0000 UTC m=+2681.335475842" lastFinishedPulling="2026-03-14 09:41:19.323571301 +0000 UTC m=+2684.836263569" observedRunningTime="2026-03-14 09:41:19.942746833 +0000 UTC m=+2685.455439121" watchObservedRunningTime="2026-03-14 09:41:19.945954584 +0000 UTC m=+2685.458646842" Mar 14 09:41:19 crc kubenswrapper[4956]: W0314 09:41:19.987718 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65b2bab5_aa6d_4da2_a689_8908125bebff.slice/crio-d850b94fa140da2d33ee0885d6baa6a1cd560e5321bb44ab3c44644fa038dcc7 WatchSource:0}: Error finding container d850b94fa140da2d33ee0885d6baa6a1cd560e5321bb44ab3c44644fa038dcc7: Status 404 returned error can't find the container with id d850b94fa140da2d33ee0885d6baa6a1cd560e5321bb44ab3c44644fa038dcc7 Mar 14 09:41:19 crc kubenswrapper[4956]: I0314 09:41:19.988332 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-j8fn2"] Mar 14 09:41:20 crc kubenswrapper[4956]: I0314 09:41:20.921929 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb" event={"ID":"d16fe9b4-f1f7-416c-a9c7-2996cddc6a29","Type":"ContainerStarted","Data":"a0ab92fa2aa3182014847d37e3fd318b21a5bb2f1c478fafac83ac6c21510369"} Mar 14 09:41:20 crc kubenswrapper[4956]: I0314 09:41:20.922282 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb" 
event={"ID":"d16fe9b4-f1f7-416c-a9c7-2996cddc6a29","Type":"ContainerStarted","Data":"53435360ce79f1b7cdf922fea7a6bb77619d05db9a947c56c0f2423ce7f01dcb"} Mar 14 09:41:20 crc kubenswrapper[4956]: I0314 09:41:20.924039 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-j8fn2" event={"ID":"65b2bab5-aa6d-4da2-a689-8908125bebff","Type":"ContainerStarted","Data":"e4536404463e2f6abe58c4069d46bcfe563f96914cb64b0fe01d5a4b2e4cba8e"} Mar 14 09:41:20 crc kubenswrapper[4956]: I0314 09:41:20.924066 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-j8fn2" event={"ID":"65b2bab5-aa6d-4da2-a689-8908125bebff","Type":"ContainerStarted","Data":"d850b94fa140da2d33ee0885d6baa6a1cd560e5321bb44ab3c44644fa038dcc7"} Mar 14 09:41:20 crc kubenswrapper[4956]: I0314 09:41:20.940678 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb" podStartSLOduration=1.940659201 podStartE2EDuration="1.940659201s" podCreationTimestamp="2026-03-14 09:41:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:41:20.937335607 +0000 UTC m=+2686.450027885" watchObservedRunningTime="2026-03-14 09:41:20.940659201 +0000 UTC m=+2686.453351469" Mar 14 09:41:20 crc kubenswrapper[4956]: I0314 09:41:20.955647 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-db-create-j8fn2" podStartSLOduration=1.95562539 podStartE2EDuration="1.95562539s" podCreationTimestamp="2026-03-14 09:41:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:41:20.952615243 +0000 UTC m=+2686.465307511" watchObservedRunningTime="2026-03-14 09:41:20.95562539 +0000 UTC m=+2686.468317658" Mar 14 09:41:21 crc 
kubenswrapper[4956]: I0314 09:41:21.935716 4956 generic.go:334] "Generic (PLEG): container finished" podID="65b2bab5-aa6d-4da2-a689-8908125bebff" containerID="e4536404463e2f6abe58c4069d46bcfe563f96914cb64b0fe01d5a4b2e4cba8e" exitCode=0 Mar 14 09:41:21 crc kubenswrapper[4956]: I0314 09:41:21.935903 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-j8fn2" event={"ID":"65b2bab5-aa6d-4da2-a689-8908125bebff","Type":"ContainerDied","Data":"e4536404463e2f6abe58c4069d46bcfe563f96914cb64b0fe01d5a4b2e4cba8e"} Mar 14 09:41:21 crc kubenswrapper[4956]: I0314 09:41:21.938257 4956 generic.go:334] "Generic (PLEG): container finished" podID="d16fe9b4-f1f7-416c-a9c7-2996cddc6a29" containerID="a0ab92fa2aa3182014847d37e3fd318b21a5bb2f1c478fafac83ac6c21510369" exitCode=0 Mar 14 09:41:21 crc kubenswrapper[4956]: I0314 09:41:21.938370 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb" event={"ID":"d16fe9b4-f1f7-416c-a9c7-2996cddc6a29","Type":"ContainerDied","Data":"a0ab92fa2aa3182014847d37e3fd318b21a5bb2f1c478fafac83ac6c21510369"} Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.387914 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-j8fn2" Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.392235 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb" Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.516604 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrzhl\" (UniqueName: \"kubernetes.io/projected/65b2bab5-aa6d-4da2-a689-8908125bebff-kube-api-access-hrzhl\") pod \"65b2bab5-aa6d-4da2-a689-8908125bebff\" (UID: \"65b2bab5-aa6d-4da2-a689-8908125bebff\") " Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.516691 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7xs5\" (UniqueName: \"kubernetes.io/projected/d16fe9b4-f1f7-416c-a9c7-2996cddc6a29-kube-api-access-t7xs5\") pod \"d16fe9b4-f1f7-416c-a9c7-2996cddc6a29\" (UID: \"d16fe9b4-f1f7-416c-a9c7-2996cddc6a29\") " Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.516721 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65b2bab5-aa6d-4da2-a689-8908125bebff-operator-scripts\") pod \"65b2bab5-aa6d-4da2-a689-8908125bebff\" (UID: \"65b2bab5-aa6d-4da2-a689-8908125bebff\") " Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.516896 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d16fe9b4-f1f7-416c-a9c7-2996cddc6a29-operator-scripts\") pod \"d16fe9b4-f1f7-416c-a9c7-2996cddc6a29\" (UID: \"d16fe9b4-f1f7-416c-a9c7-2996cddc6a29\") " Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.517466 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d16fe9b4-f1f7-416c-a9c7-2996cddc6a29-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d16fe9b4-f1f7-416c-a9c7-2996cddc6a29" (UID: "d16fe9b4-f1f7-416c-a9c7-2996cddc6a29"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.517518 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65b2bab5-aa6d-4da2-a689-8908125bebff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65b2bab5-aa6d-4da2-a689-8908125bebff" (UID: "65b2bab5-aa6d-4da2-a689-8908125bebff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.521807 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d16fe9b4-f1f7-416c-a9c7-2996cddc6a29-kube-api-access-t7xs5" (OuterVolumeSpecName: "kube-api-access-t7xs5") pod "d16fe9b4-f1f7-416c-a9c7-2996cddc6a29" (UID: "d16fe9b4-f1f7-416c-a9c7-2996cddc6a29"). InnerVolumeSpecName "kube-api-access-t7xs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.529021 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65b2bab5-aa6d-4da2-a689-8908125bebff-kube-api-access-hrzhl" (OuterVolumeSpecName: "kube-api-access-hrzhl") pod "65b2bab5-aa6d-4da2-a689-8908125bebff" (UID: "65b2bab5-aa6d-4da2-a689-8908125bebff"). InnerVolumeSpecName "kube-api-access-hrzhl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.619854 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d16fe9b4-f1f7-416c-a9c7-2996cddc6a29-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.619890 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrzhl\" (UniqueName: \"kubernetes.io/projected/65b2bab5-aa6d-4da2-a689-8908125bebff-kube-api-access-hrzhl\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.619901 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7xs5\" (UniqueName: \"kubernetes.io/projected/d16fe9b4-f1f7-416c-a9c7-2996cddc6a29-kube-api-access-t7xs5\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.619909 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65b2bab5-aa6d-4da2-a689-8908125bebff-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.967414 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-j8fn2" Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.967813 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-j8fn2" event={"ID":"65b2bab5-aa6d-4da2-a689-8908125bebff","Type":"ContainerDied","Data":"d850b94fa140da2d33ee0885d6baa6a1cd560e5321bb44ab3c44644fa038dcc7"} Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.967945 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d850b94fa140da2d33ee0885d6baa6a1cd560e5321bb44ab3c44644fa038dcc7" Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.970261 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb" event={"ID":"d16fe9b4-f1f7-416c-a9c7-2996cddc6a29","Type":"ContainerDied","Data":"53435360ce79f1b7cdf922fea7a6bb77619d05db9a947c56c0f2423ce7f01dcb"} Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.970287 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53435360ce79f1b7cdf922fea7a6bb77619d05db9a947c56c0f2423ce7f01dcb" Mar 14 09:41:23 crc kubenswrapper[4956]: I0314 09:41:23.970336 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.366393 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp"] Mar 14 09:41:29 crc kubenswrapper[4956]: E0314 09:41:29.367286 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b2bab5-aa6d-4da2-a689-8908125bebff" containerName="mariadb-database-create" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.367300 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b2bab5-aa6d-4da2-a689-8908125bebff" containerName="mariadb-database-create" Mar 14 09:41:29 crc kubenswrapper[4956]: E0314 09:41:29.367319 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d16fe9b4-f1f7-416c-a9c7-2996cddc6a29" containerName="mariadb-account-create-update" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.367326 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16fe9b4-f1f7-416c-a9c7-2996cddc6a29" containerName="mariadb-account-create-update" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.367469 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="65b2bab5-aa6d-4da2-a689-8908125bebff" containerName="mariadb-database-create" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.367508 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d16fe9b4-f1f7-416c-a9c7-2996cddc6a29" containerName="mariadb-account-create-update" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.368063 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.371632 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.371897 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-tz552" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.378799 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp"] Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.521209 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-sjwtp\" (UID: \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.521301 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-config-data\") pod \"watcher-kuttl-db-sync-sjwtp\" (UID: \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.521539 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp8zq\" (UniqueName: \"kubernetes.io/projected/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-kube-api-access-hp8zq\") pod \"watcher-kuttl-db-sync-sjwtp\" (UID: \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.521596 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-db-sync-config-data\") pod \"watcher-kuttl-db-sync-sjwtp\" (UID: \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.623637 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp8zq\" (UniqueName: \"kubernetes.io/projected/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-kube-api-access-hp8zq\") pod \"watcher-kuttl-db-sync-sjwtp\" (UID: \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.623688 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-db-sync-config-data\") pod \"watcher-kuttl-db-sync-sjwtp\" (UID: \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.623755 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-sjwtp\" (UID: \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.623810 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-config-data\") pod \"watcher-kuttl-db-sync-sjwtp\" (UID: \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 
09:41:29.629036 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-db-sync-config-data\") pod \"watcher-kuttl-db-sync-sjwtp\" (UID: \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.629637 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-sjwtp\" (UID: \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.629673 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-config-data\") pod \"watcher-kuttl-db-sync-sjwtp\" (UID: \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.643049 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp8zq\" (UniqueName: \"kubernetes.io/projected/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-kube-api-access-hp8zq\") pod \"watcher-kuttl-db-sync-sjwtp\" (UID: \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" Mar 14 09:41:29 crc kubenswrapper[4956]: I0314 09:41:29.694541 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" Mar 14 09:41:30 crc kubenswrapper[4956]: I0314 09:41:30.140184 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp"] Mar 14 09:41:30 crc kubenswrapper[4956]: W0314 09:41:30.146435 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ab8d230_dc20_4f35_bb22_fe0bb9c3a0e6.slice/crio-fce5d6bb58eaece0a709554854fd125ab65600ef84a24de2af13ae7681aed2bd WatchSource:0}: Error finding container fce5d6bb58eaece0a709554854fd125ab65600ef84a24de2af13ae7681aed2bd: Status 404 returned error can't find the container with id fce5d6bb58eaece0a709554854fd125ab65600ef84a24de2af13ae7681aed2bd Mar 14 09:41:31 crc kubenswrapper[4956]: I0314 09:41:31.032561 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" event={"ID":"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6","Type":"ContainerStarted","Data":"fcd7cf09991007254a8a2dd7e1cf7fee84fd89807c5d9afdb30692408555c7a3"} Mar 14 09:41:31 crc kubenswrapper[4956]: I0314 09:41:31.033095 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" event={"ID":"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6","Type":"ContainerStarted","Data":"fce5d6bb58eaece0a709554854fd125ab65600ef84a24de2af13ae7681aed2bd"} Mar 14 09:41:31 crc kubenswrapper[4956]: I0314 09:41:31.053185 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" podStartSLOduration=2.053160402 podStartE2EDuration="2.053160402s" podCreationTimestamp="2026-03-14 09:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:41:31.046811422 +0000 UTC m=+2696.559503690" watchObservedRunningTime="2026-03-14 
09:41:31.053160402 +0000 UTC m=+2696.565852670" Mar 14 09:41:33 crc kubenswrapper[4956]: I0314 09:41:33.054954 4956 generic.go:334] "Generic (PLEG): container finished" podID="0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6" containerID="fcd7cf09991007254a8a2dd7e1cf7fee84fd89807c5d9afdb30692408555c7a3" exitCode=0 Mar 14 09:41:33 crc kubenswrapper[4956]: I0314 09:41:33.055020 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" event={"ID":"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6","Type":"ContainerDied","Data":"fcd7cf09991007254a8a2dd7e1cf7fee84fd89807c5d9afdb30692408555c7a3"} Mar 14 09:41:34 crc kubenswrapper[4956]: I0314 09:41:34.361575 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" Mar 14 09:41:34 crc kubenswrapper[4956]: I0314 09:41:34.442556 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-db-sync-config-data\") pod \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\" (UID: \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\") " Mar 14 09:41:34 crc kubenswrapper[4956]: I0314 09:41:34.442658 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-combined-ca-bundle\") pod \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\" (UID: \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\") " Mar 14 09:41:34 crc kubenswrapper[4956]: I0314 09:41:34.442899 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-config-data\") pod \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\" (UID: \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\") " Mar 14 09:41:34 crc kubenswrapper[4956]: I0314 09:41:34.442929 4956 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-hp8zq\" (UniqueName: \"kubernetes.io/projected/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-kube-api-access-hp8zq\") pod \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\" (UID: \"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6\") " Mar 14 09:41:34 crc kubenswrapper[4956]: I0314 09:41:34.451348 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6" (UID: "0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:34 crc kubenswrapper[4956]: I0314 09:41:34.451411 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-kube-api-access-hp8zq" (OuterVolumeSpecName: "kube-api-access-hp8zq") pod "0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6" (UID: "0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6"). InnerVolumeSpecName "kube-api-access-hp8zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:41:34 crc kubenswrapper[4956]: I0314 09:41:34.465778 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6" (UID: "0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:34 crc kubenswrapper[4956]: I0314 09:41:34.498350 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-config-data" (OuterVolumeSpecName: "config-data") pod "0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6" (UID: "0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:34 crc kubenswrapper[4956]: I0314 09:41:34.544948 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:34 crc kubenswrapper[4956]: I0314 09:41:34.544985 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp8zq\" (UniqueName: \"kubernetes.io/projected/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-kube-api-access-hp8zq\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:34 crc kubenswrapper[4956]: I0314 09:41:34.544997 4956 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:34 crc kubenswrapper[4956]: I0314 09:41:34.545009 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.076754 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" event={"ID":"0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6","Type":"ContainerDied","Data":"fce5d6bb58eaece0a709554854fd125ab65600ef84a24de2af13ae7681aed2bd"} Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.077071 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fce5d6bb58eaece0a709554854fd125ab65600ef84a24de2af13ae7681aed2bd" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.076865 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.323769 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:41:35 crc kubenswrapper[4956]: E0314 09:41:35.324117 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6" containerName="watcher-kuttl-db-sync" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.324131 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6" containerName="watcher-kuttl-db-sync" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.324299 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6" containerName="watcher-kuttl-db-sync" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.325340 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.332625 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.348473 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-tz552" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.356443 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.402244 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.403406 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.420343 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.448023 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.476973 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.477015 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.477064 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.477086 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-logs\") pod \"watcher-kuttl-api-0\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.477103 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.477230 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.477251 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.477267 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.477394 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stmk5\" (UniqueName: \"kubernetes.io/projected/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-kube-api-access-stmk5\") pod 
\"watcher-kuttl-api-0\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.477420 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06eb83ec-f4cc-4b4d-a129-cb029c188810-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.477580 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82btc\" (UniqueName: \"kubernetes.io/projected/06eb83ec-f4cc-4b4d-a129-cb029c188810-kube-api-access-82btc\") pod \"watcher-kuttl-applier-0\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.516619 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.517851 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.544046 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.561374 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.579405 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.579466 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.579507 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82btc\" (UniqueName: \"kubernetes.io/projected/06eb83ec-f4cc-4b4d-a129-cb029c188810-kube-api-access-82btc\") pod \"watcher-kuttl-applier-0\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.579531 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-cert-memcached-mtls\") pod 
\"watcher-kuttl-decision-engine-0\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.579556 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.579586 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.579608 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.579647 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcr9v\" (UniqueName: \"kubernetes.io/projected/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-kube-api-access-fcr9v\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.579662 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.579681 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-logs\") pod \"watcher-kuttl-api-0\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.579698 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.579721 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.579740 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.579773 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.579790 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.579821 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stmk5\" (UniqueName: \"kubernetes.io/projected/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-kube-api-access-stmk5\") pod \"watcher-kuttl-api-0\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.579843 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06eb83ec-f4cc-4b4d-a129-cb029c188810-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.580257 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06eb83ec-f4cc-4b4d-a129-cb029c188810-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.581620 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-logs\") pod \"watcher-kuttl-api-0\" (UID: 
\"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.593257 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.594177 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.595585 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.599322 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.604955 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.616961 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.619768 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.621911 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82btc\" (UniqueName: \"kubernetes.io/projected/06eb83ec-f4cc-4b4d-a129-cb029c188810-kube-api-access-82btc\") pod \"watcher-kuttl-applier-0\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.644655 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stmk5\" (UniqueName: \"kubernetes.io/projected/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-kube-api-access-stmk5\") pod \"watcher-kuttl-api-0\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.682409 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcr9v\" (UniqueName: \"kubernetes.io/projected/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-kube-api-access-fcr9v\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.682499 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.682564 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.682591 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.682615 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.682640 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: 
\"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.687444 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.694149 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.696069 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.701142 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.708516 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: 
\"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.716784 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.724413 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcr9v\" (UniqueName: \"kubernetes.io/projected/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-kube-api-access-fcr9v\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.850137 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:35 crc kubenswrapper[4956]: I0314 09:41:35.941806 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:36 crc kubenswrapper[4956]: W0314 09:41:36.176932 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06eb83ec_f4cc_4b4d_a129_cb029c188810.slice/crio-0de3c8dc7cb59adbf114800057a183b0638cdda238607ed451eeb08da58ddf91 WatchSource:0}: Error finding container 0de3c8dc7cb59adbf114800057a183b0638cdda238607ed451eeb08da58ddf91: Status 404 returned error can't find the container with id 0de3c8dc7cb59adbf114800057a183b0638cdda238607ed451eeb08da58ddf91 Mar 14 09:41:36 crc kubenswrapper[4956]: I0314 09:41:36.178522 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:41:36 crc kubenswrapper[4956]: I0314 09:41:36.302068 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:41:36 crc kubenswrapper[4956]: I0314 09:41:36.412577 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:41:36 crc kubenswrapper[4956]: W0314 09:41:36.429021 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37b8cbae_06d6_4ce8_b3c0_ab05743ebb80.slice/crio-e00a45429abb13a296fe7d50e78121c18642046878c14c76746c16ec44db501b WatchSource:0}: Error finding container e00a45429abb13a296fe7d50e78121c18642046878c14c76746c16ec44db501b: Status 404 returned error can't find the container with id e00a45429abb13a296fe7d50e78121c18642046878c14c76746c16ec44db501b Mar 14 09:41:37 crc kubenswrapper[4956]: I0314 09:41:37.125895 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80","Type":"ContainerStarted","Data":"7e5cf6b205184cc09c2b7019a3e9b09c1b956a7ceaf1e9c02ce1f5e4b0c5b4ab"} Mar 14 
09:41:37 crc kubenswrapper[4956]: I0314 09:41:37.126233 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80","Type":"ContainerStarted","Data":"8cdc4ce98055306fde55b3a21b37398de8997ab5d2c0790dd8a28464255a27cb"} Mar 14 09:41:37 crc kubenswrapper[4956]: I0314 09:41:37.126249 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80","Type":"ContainerStarted","Data":"e00a45429abb13a296fe7d50e78121c18642046878c14c76746c16ec44db501b"} Mar 14 09:41:37 crc kubenswrapper[4956]: I0314 09:41:37.126646 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:37 crc kubenswrapper[4956]: I0314 09:41:37.128535 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"06eb83ec-f4cc-4b4d-a129-cb029c188810","Type":"ContainerStarted","Data":"79e969033a432d36fda60588046b667991384b93b2b83cc0270dcc12ba55ddc9"} Mar 14 09:41:37 crc kubenswrapper[4956]: I0314 09:41:37.129016 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"06eb83ec-f4cc-4b4d-a129-cb029c188810","Type":"ContainerStarted","Data":"0de3c8dc7cb59adbf114800057a183b0638cdda238607ed451eeb08da58ddf91"} Mar 14 09:41:37 crc kubenswrapper[4956]: I0314 09:41:37.131252 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51","Type":"ContainerStarted","Data":"64abe770ac3c8873baddb1131c39b9e1338f98b06403d389b78593e1c547a6be"} Mar 14 09:41:37 crc kubenswrapper[4956]: I0314 09:41:37.131293 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
event={"ID":"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51","Type":"ContainerStarted","Data":"d9ecd5124c6effd229b32e6a64b2aeef13b96d62efd829358eabc0eed444a921"} Mar 14 09:41:37 crc kubenswrapper[4956]: I0314 09:41:37.150766 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.150746558 podStartE2EDuration="2.150746558s" podCreationTimestamp="2026-03-14 09:41:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:41:37.144434478 +0000 UTC m=+2702.657126746" watchObservedRunningTime="2026-03-14 09:41:37.150746558 +0000 UTC m=+2702.663438826" Mar 14 09:41:37 crc kubenswrapper[4956]: I0314 09:41:37.191396 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.191375816 podStartE2EDuration="2.191375816s" podCreationTimestamp="2026-03-14 09:41:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:41:37.172660872 +0000 UTC m=+2702.685353150" watchObservedRunningTime="2026-03-14 09:41:37.191375816 +0000 UTC m=+2702.704068084" Mar 14 09:41:37 crc kubenswrapper[4956]: I0314 09:41:37.192669 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.192661739 podStartE2EDuration="2.192661739s" podCreationTimestamp="2026-03-14 09:41:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:41:37.186856772 +0000 UTC m=+2702.699549060" watchObservedRunningTime="2026-03-14 09:41:37.192661739 +0000 UTC m=+2702.705354007" Mar 14 09:41:37 crc kubenswrapper[4956]: I0314 09:41:37.373009 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:41:38 crc kubenswrapper[4956]: I0314 09:41:38.501631 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:41:39 crc kubenswrapper[4956]: I0314 09:41:39.354096 4956 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 09:41:39 crc kubenswrapper[4956]: I0314 09:41:39.519240 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:40 crc kubenswrapper[4956]: I0314 09:41:40.035449 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:41:40 crc kubenswrapper[4956]: I0314 09:41:40.720215 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:40 crc kubenswrapper[4956]: I0314 09:41:40.942801 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:41 crc kubenswrapper[4956]: I0314 09:41:41.198509 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:41:42 crc kubenswrapper[4956]: I0314 09:41:42.368143 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:41:43 crc kubenswrapper[4956]: I0314 09:41:43.527087 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:41:44 crc kubenswrapper[4956]: I0314 09:41:44.754052 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:41:45 crc kubenswrapper[4956]: I0314 09:41:45.248444 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:45 crc kubenswrapper[4956]: I0314 09:41:45.717749 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:45 crc kubenswrapper[4956]: I0314 09:41:45.741511 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:45 crc kubenswrapper[4956]: I0314 09:41:45.850642 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:45 crc kubenswrapper[4956]: I0314 09:41:45.874612 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:45 crc kubenswrapper[4956]: I0314 09:41:45.942602 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:45 crc kubenswrapper[4956]: I0314 09:41:45.946101 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:45 crc kubenswrapper[4956]: I0314 09:41:45.984393 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:41:46 crc 
kubenswrapper[4956]: I0314 09:41:46.419963 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:46 crc kubenswrapper[4956]: I0314 09:41:46.427275 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:41:46 crc kubenswrapper[4956]: I0314 09:41:46.447464 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:41:46 crc kubenswrapper[4956]: I0314 09:41:46.455236 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:41:47 crc kubenswrapper[4956]: I0314 09:41:47.155511 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:41:47 crc kubenswrapper[4956]: I0314 09:41:47.389270 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:41:47 crc kubenswrapper[4956]: I0314 09:41:47.909495 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-db-create-d7v5r"] Mar 14 09:41:47 crc kubenswrapper[4956]: I0314 09:41:47.911532 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-db-create-d7v5r" Mar 14 09:41:47 crc kubenswrapper[4956]: I0314 09:41:47.935069 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-db-create-d7v5r"] Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.001666 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4"] Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.002792 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4" Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.006103 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-db-secret" Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.014194 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4"] Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.082174 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skgms\" (UniqueName: \"kubernetes.io/projected/c28c8d67-468c-4083-ad8a-17fdfa500bff-kube-api-access-skgms\") pod \"cinder-db-create-d7v5r\" (UID: \"c28c8d67-468c-4083-ad8a-17fdfa500bff\") " pod="watcher-kuttl-default/cinder-db-create-d7v5r" Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.082278 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28c8d67-468c-4083-ad8a-17fdfa500bff-operator-scripts\") pod \"cinder-db-create-d7v5r\" (UID: \"c28c8d67-468c-4083-ad8a-17fdfa500bff\") " pod="watcher-kuttl-default/cinder-db-create-d7v5r" Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.184027 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0162871-a7ca-4c7b-8147-884d131abcd6-operator-scripts\") pod \"cinder-46a3-account-create-update-ln9q4\" (UID: \"a0162871-a7ca-4c7b-8147-884d131abcd6\") " pod="watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4" Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.185280 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w58xk\" (UniqueName: \"kubernetes.io/projected/a0162871-a7ca-4c7b-8147-884d131abcd6-kube-api-access-w58xk\") pod \"cinder-46a3-account-create-update-ln9q4\" (UID: \"a0162871-a7ca-4c7b-8147-884d131abcd6\") " pod="watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4" Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.185376 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skgms\" (UniqueName: \"kubernetes.io/projected/c28c8d67-468c-4083-ad8a-17fdfa500bff-kube-api-access-skgms\") pod \"cinder-db-create-d7v5r\" (UID: \"c28c8d67-468c-4083-ad8a-17fdfa500bff\") " pod="watcher-kuttl-default/cinder-db-create-d7v5r" Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.185536 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28c8d67-468c-4083-ad8a-17fdfa500bff-operator-scripts\") pod \"cinder-db-create-d7v5r\" (UID: \"c28c8d67-468c-4083-ad8a-17fdfa500bff\") " pod="watcher-kuttl-default/cinder-db-create-d7v5r" Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.186383 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28c8d67-468c-4083-ad8a-17fdfa500bff-operator-scripts\") pod \"cinder-db-create-d7v5r\" (UID: \"c28c8d67-468c-4083-ad8a-17fdfa500bff\") " pod="watcher-kuttl-default/cinder-db-create-d7v5r" Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.215748 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skgms\" (UniqueName: \"kubernetes.io/projected/c28c8d67-468c-4083-ad8a-17fdfa500bff-kube-api-access-skgms\") pod \"cinder-db-create-d7v5r\" (UID: \"c28c8d67-468c-4083-ad8a-17fdfa500bff\") " pod="watcher-kuttl-default/cinder-db-create-d7v5r" Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.248080 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-db-create-d7v5r" Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.287199 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0162871-a7ca-4c7b-8147-884d131abcd6-operator-scripts\") pod \"cinder-46a3-account-create-update-ln9q4\" (UID: \"a0162871-a7ca-4c7b-8147-884d131abcd6\") " pod="watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4" Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.287260 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w58xk\" (UniqueName: \"kubernetes.io/projected/a0162871-a7ca-4c7b-8147-884d131abcd6-kube-api-access-w58xk\") pod \"cinder-46a3-account-create-update-ln9q4\" (UID: \"a0162871-a7ca-4c7b-8147-884d131abcd6\") " pod="watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4" Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.288281 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0162871-a7ca-4c7b-8147-884d131abcd6-operator-scripts\") pod \"cinder-46a3-account-create-update-ln9q4\" (UID: \"a0162871-a7ca-4c7b-8147-884d131abcd6\") " pod="watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4" Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.310175 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w58xk\" (UniqueName: 
\"kubernetes.io/projected/a0162871-a7ca-4c7b-8147-884d131abcd6-kube-api-access-w58xk\") pod \"cinder-46a3-account-create-update-ln9q4\" (UID: \"a0162871-a7ca-4c7b-8147-884d131abcd6\") " pod="watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4" Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.321616 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4" Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.576887 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.715384 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-db-create-d7v5r"] Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.859525 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.859938 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="47aa75ef-abe9-4750-a802-39776852fdd3" containerName="ceilometer-central-agent" containerID="cri-o://4feff8fc2de81fc8f29ddec91ca120be98f23491b0e9dafae2f8e43cc6e4f36f" gracePeriod=30 Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.860583 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="47aa75ef-abe9-4750-a802-39776852fdd3" containerName="ceilometer-notification-agent" containerID="cri-o://7f1db5fb5acc964a075a72dbc77e4a7b76dd333ef2bcc636124333ea323cb6ca" gracePeriod=30 Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.860636 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" 
podUID="47aa75ef-abe9-4750-a802-39776852fdd3" containerName="sg-core" containerID="cri-o://40e184c2aa6b8176c759d01407ad825c2edd13f752babbc435bb80452080acc0" gracePeriod=30 Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.860803 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="47aa75ef-abe9-4750-a802-39776852fdd3" containerName="proxy-httpd" containerID="cri-o://77bf796e2308ac253d55178626033ac6197831c5593c126a7f37f0446cdadcd8" gracePeriod=30 Mar 14 09:41:48 crc kubenswrapper[4956]: I0314 09:41:48.892044 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4"] Mar 14 09:41:49 crc kubenswrapper[4956]: I0314 09:41:49.456201 4956 generic.go:334] "Generic (PLEG): container finished" podID="47aa75ef-abe9-4750-a802-39776852fdd3" containerID="77bf796e2308ac253d55178626033ac6197831c5593c126a7f37f0446cdadcd8" exitCode=0 Mar 14 09:41:49 crc kubenswrapper[4956]: I0314 09:41:49.456456 4956 generic.go:334] "Generic (PLEG): container finished" podID="47aa75ef-abe9-4750-a802-39776852fdd3" containerID="40e184c2aa6b8176c759d01407ad825c2edd13f752babbc435bb80452080acc0" exitCode=2 Mar 14 09:41:49 crc kubenswrapper[4956]: I0314 09:41:49.456465 4956 generic.go:334] "Generic (PLEG): container finished" podID="47aa75ef-abe9-4750-a802-39776852fdd3" containerID="4feff8fc2de81fc8f29ddec91ca120be98f23491b0e9dafae2f8e43cc6e4f36f" exitCode=0 Mar 14 09:41:49 crc kubenswrapper[4956]: I0314 09:41:49.456514 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"47aa75ef-abe9-4750-a802-39776852fdd3","Type":"ContainerDied","Data":"77bf796e2308ac253d55178626033ac6197831c5593c126a7f37f0446cdadcd8"} Mar 14 09:41:49 crc kubenswrapper[4956]: I0314 09:41:49.456539 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"47aa75ef-abe9-4750-a802-39776852fdd3","Type":"ContainerDied","Data":"40e184c2aa6b8176c759d01407ad825c2edd13f752babbc435bb80452080acc0"} Mar 14 09:41:49 crc kubenswrapper[4956]: I0314 09:41:49.456548 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"47aa75ef-abe9-4750-a802-39776852fdd3","Type":"ContainerDied","Data":"4feff8fc2de81fc8f29ddec91ca120be98f23491b0e9dafae2f8e43cc6e4f36f"} Mar 14 09:41:49 crc kubenswrapper[4956]: I0314 09:41:49.458121 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4" event={"ID":"a0162871-a7ca-4c7b-8147-884d131abcd6","Type":"ContainerStarted","Data":"262120b322fff85bf80f0b7b1393c99f22e6b06a0982bcc04f4ba407e0b1e385"} Mar 14 09:41:49 crc kubenswrapper[4956]: I0314 09:41:49.458146 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4" event={"ID":"a0162871-a7ca-4c7b-8147-884d131abcd6","Type":"ContainerStarted","Data":"33b30dd0fa4b1de17fe5681eb509c87a951a9c497da02698326653add4b4b2b5"} Mar 14 09:41:49 crc kubenswrapper[4956]: I0314 09:41:49.460987 4956 generic.go:334] "Generic (PLEG): container finished" podID="c28c8d67-468c-4083-ad8a-17fdfa500bff" containerID="03c0b99550487f4685aa386603445d10f61ee3410b31f7b8217e2105527bfe83" exitCode=0 Mar 14 09:41:49 crc kubenswrapper[4956]: I0314 09:41:49.461046 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-create-d7v5r" event={"ID":"c28c8d67-468c-4083-ad8a-17fdfa500bff","Type":"ContainerDied","Data":"03c0b99550487f4685aa386603445d10f61ee3410b31f7b8217e2105527bfe83"} Mar 14 09:41:49 crc kubenswrapper[4956]: I0314 09:41:49.461079 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-create-d7v5r" 
event={"ID":"c28c8d67-468c-4083-ad8a-17fdfa500bff","Type":"ContainerStarted","Data":"b6434cebb984551500e2c275bd102248979ff8cc642225c224998b7f40d45762"} Mar 14 09:41:49 crc kubenswrapper[4956]: I0314 09:41:49.485554 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4" podStartSLOduration=2.485529481 podStartE2EDuration="2.485529481s" podCreationTimestamp="2026-03-14 09:41:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:41:49.479895889 +0000 UTC m=+2714.992588157" watchObservedRunningTime="2026-03-14 09:41:49.485529481 +0000 UTC m=+2714.998221759" Mar 14 09:41:49 crc kubenswrapper[4956]: I0314 09:41:49.752845 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:41:50 crc kubenswrapper[4956]: I0314 09:41:50.468521 4956 generic.go:334] "Generic (PLEG): container finished" podID="a0162871-a7ca-4c7b-8147-884d131abcd6" containerID="262120b322fff85bf80f0b7b1393c99f22e6b06a0982bcc04f4ba407e0b1e385" exitCode=0 Mar 14 09:41:50 crc kubenswrapper[4956]: I0314 09:41:50.468621 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4" event={"ID":"a0162871-a7ca-4c7b-8147-884d131abcd6","Type":"ContainerDied","Data":"262120b322fff85bf80f0b7b1393c99f22e6b06a0982bcc04f4ba407e0b1e385"} Mar 14 09:41:50 crc kubenswrapper[4956]: I0314 09:41:50.817138 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-db-create-d7v5r" Mar 14 09:41:50 crc kubenswrapper[4956]: I0314 09:41:50.938616 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c28c8d67-468c-4083-ad8a-17fdfa500bff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c28c8d67-468c-4083-ad8a-17fdfa500bff" (UID: "c28c8d67-468c-4083-ad8a-17fdfa500bff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:41:50 crc kubenswrapper[4956]: I0314 09:41:50.938692 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28c8d67-468c-4083-ad8a-17fdfa500bff-operator-scripts\") pod \"c28c8d67-468c-4083-ad8a-17fdfa500bff\" (UID: \"c28c8d67-468c-4083-ad8a-17fdfa500bff\") " Mar 14 09:41:50 crc kubenswrapper[4956]: I0314 09:41:50.939395 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:41:50 crc kubenswrapper[4956]: I0314 09:41:50.939638 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skgms\" (UniqueName: \"kubernetes.io/projected/c28c8d67-468c-4083-ad8a-17fdfa500bff-kube-api-access-skgms\") pod \"c28c8d67-468c-4083-ad8a-17fdfa500bff\" (UID: \"c28c8d67-468c-4083-ad8a-17fdfa500bff\") " Mar 14 09:41:50 crc kubenswrapper[4956]: I0314 09:41:50.940085 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28c8d67-468c-4083-ad8a-17fdfa500bff-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:50 crc kubenswrapper[4956]: I0314 09:41:50.947859 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c28c8d67-468c-4083-ad8a-17fdfa500bff-kube-api-access-skgms" 
(OuterVolumeSpecName: "kube-api-access-skgms") pod "c28c8d67-468c-4083-ad8a-17fdfa500bff" (UID: "c28c8d67-468c-4083-ad8a-17fdfa500bff"). InnerVolumeSpecName "kube-api-access-skgms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:41:51 crc kubenswrapper[4956]: I0314 09:41:51.041919 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skgms\" (UniqueName: \"kubernetes.io/projected/c28c8d67-468c-4083-ad8a-17fdfa500bff-kube-api-access-skgms\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:51 crc kubenswrapper[4956]: I0314 09:41:51.479016 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-db-create-d7v5r" Mar 14 09:41:51 crc kubenswrapper[4956]: I0314 09:41:51.479080 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-create-d7v5r" event={"ID":"c28c8d67-468c-4083-ad8a-17fdfa500bff","Type":"ContainerDied","Data":"b6434cebb984551500e2c275bd102248979ff8cc642225c224998b7f40d45762"} Mar 14 09:41:51 crc kubenswrapper[4956]: I0314 09:41:51.479751 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6434cebb984551500e2c275bd102248979ff8cc642225c224998b7f40d45762" Mar 14 09:41:51 crc kubenswrapper[4956]: I0314 09:41:51.811797 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4" Mar 14 09:41:51 crc kubenswrapper[4956]: I0314 09:41:51.957780 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0162871-a7ca-4c7b-8147-884d131abcd6-operator-scripts\") pod \"a0162871-a7ca-4c7b-8147-884d131abcd6\" (UID: \"a0162871-a7ca-4c7b-8147-884d131abcd6\") " Mar 14 09:41:51 crc kubenswrapper[4956]: I0314 09:41:51.957879 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w58xk\" (UniqueName: \"kubernetes.io/projected/a0162871-a7ca-4c7b-8147-884d131abcd6-kube-api-access-w58xk\") pod \"a0162871-a7ca-4c7b-8147-884d131abcd6\" (UID: \"a0162871-a7ca-4c7b-8147-884d131abcd6\") " Mar 14 09:41:51 crc kubenswrapper[4956]: I0314 09:41:51.958528 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0162871-a7ca-4c7b-8147-884d131abcd6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0162871-a7ca-4c7b-8147-884d131abcd6" (UID: "a0162871-a7ca-4c7b-8147-884d131abcd6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:41:51 crc kubenswrapper[4956]: I0314 09:41:51.962790 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0162871-a7ca-4c7b-8147-884d131abcd6-kube-api-access-w58xk" (OuterVolumeSpecName: "kube-api-access-w58xk") pod "a0162871-a7ca-4c7b-8147-884d131abcd6" (UID: "a0162871-a7ca-4c7b-8147-884d131abcd6"). InnerVolumeSpecName "kube-api-access-w58xk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:41:52 crc kubenswrapper[4956]: I0314 09:41:52.072649 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0162871-a7ca-4c7b-8147-884d131abcd6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:52 crc kubenswrapper[4956]: I0314 09:41:52.072684 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w58xk\" (UniqueName: \"kubernetes.io/projected/a0162871-a7ca-4c7b-8147-884d131abcd6-kube-api-access-w58xk\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:52 crc kubenswrapper[4956]: I0314 09:41:52.130794 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:41:52 crc kubenswrapper[4956]: I0314 09:41:52.491338 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4" event={"ID":"a0162871-a7ca-4c7b-8147-884d131abcd6","Type":"ContainerDied","Data":"33b30dd0fa4b1de17fe5681eb509c87a951a9c497da02698326653add4b4b2b5"} Mar 14 09:41:52 crc kubenswrapper[4956]: I0314 09:41:52.491663 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33b30dd0fa4b1de17fe5681eb509c87a951a9c497da02698326653add4b4b2b5" Mar 14 09:41:52 crc kubenswrapper[4956]: I0314 09:41:52.491457 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.232321 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-db-sync-l5tm2"] Mar 14 09:41:53 crc kubenswrapper[4956]: E0314 09:41:53.232680 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0162871-a7ca-4c7b-8147-884d131abcd6" containerName="mariadb-account-create-update" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.232701 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0162871-a7ca-4c7b-8147-884d131abcd6" containerName="mariadb-account-create-update" Mar 14 09:41:53 crc kubenswrapper[4956]: E0314 09:41:53.232734 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c28c8d67-468c-4083-ad8a-17fdfa500bff" containerName="mariadb-database-create" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.232740 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28c8d67-468c-4083-ad8a-17fdfa500bff" containerName="mariadb-database-create" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.232871 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0162871-a7ca-4c7b-8147-884d131abcd6" containerName="mariadb-account-create-update" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.232891 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="c28c8d67-468c-4083-ad8a-17fdfa500bff" containerName="mariadb-database-create" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.233433 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.235028 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-cinder-dockercfg-m88cn" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.235474 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-scripts" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.236666 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-config-data" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.253126 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-db-sync-l5tm2"] Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.294448 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-combined-ca-bundle\") pod \"cinder-db-sync-l5tm2\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.294570 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-db-sync-config-data\") pod \"cinder-db-sync-l5tm2\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.294630 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbcgv\" (UniqueName: \"kubernetes.io/projected/72dd9b23-3681-4a76-bc25-a2271e65d0aa-kube-api-access-gbcgv\") pod \"cinder-db-sync-l5tm2\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " 
pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.294711 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-config-data\") pod \"cinder-db-sync-l5tm2\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.294845 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72dd9b23-3681-4a76-bc25-a2271e65d0aa-etc-machine-id\") pod \"cinder-db-sync-l5tm2\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.294871 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-scripts\") pod \"cinder-db-sync-l5tm2\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.301994 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.396299 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-combined-ca-bundle\") pod \"cinder-db-sync-l5tm2\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.396369 4956 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-db-sync-config-data\") pod \"cinder-db-sync-l5tm2\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.396395 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbcgv\" (UniqueName: \"kubernetes.io/projected/72dd9b23-3681-4a76-bc25-a2271e65d0aa-kube-api-access-gbcgv\") pod \"cinder-db-sync-l5tm2\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.396420 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-config-data\") pod \"cinder-db-sync-l5tm2\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.396474 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72dd9b23-3681-4a76-bc25-a2271e65d0aa-etc-machine-id\") pod \"cinder-db-sync-l5tm2\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.396518 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-scripts\") pod \"cinder-db-sync-l5tm2\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.397193 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/72dd9b23-3681-4a76-bc25-a2271e65d0aa-etc-machine-id\") pod \"cinder-db-sync-l5tm2\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.400698 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-db-sync-config-data\") pod \"cinder-db-sync-l5tm2\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.401858 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-config-data\") pod \"cinder-db-sync-l5tm2\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.403912 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-scripts\") pod \"cinder-db-sync-l5tm2\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.412580 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-combined-ca-bundle\") pod \"cinder-db-sync-l5tm2\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.412897 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbcgv\" (UniqueName: \"kubernetes.io/projected/72dd9b23-3681-4a76-bc25-a2271e65d0aa-kube-api-access-gbcgv\") pod \"cinder-db-sync-l5tm2\" 
(UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") " pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.548860 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-db-sync-l5tm2" Mar 14 09:41:53 crc kubenswrapper[4956]: I0314 09:41:53.996323 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-db-sync-l5tm2"] Mar 14 09:41:54 crc kubenswrapper[4956]: W0314 09:41:54.001025 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72dd9b23_3681_4a76_bc25_a2271e65d0aa.slice/crio-2446793245fab3811933f925cd42daf98700de4848932fb88fe49b68bb307222 WatchSource:0}: Error finding container 2446793245fab3811933f925cd42daf98700de4848932fb88fe49b68bb307222: Status 404 returned error can't find the container with id 2446793245fab3811933f925cd42daf98700de4848932fb88fe49b68bb307222 Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.246594 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.309882 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-config-data\") pod \"47aa75ef-abe9-4750-a802-39776852fdd3\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.310128 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47aa75ef-abe9-4750-a802-39776852fdd3-run-httpd\") pod \"47aa75ef-abe9-4750-a802-39776852fdd3\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.310356 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-ceilometer-tls-certs\") pod \"47aa75ef-abe9-4750-a802-39776852fdd3\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.310568 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9pmd\" (UniqueName: \"kubernetes.io/projected/47aa75ef-abe9-4750-a802-39776852fdd3-kube-api-access-c9pmd\") pod \"47aa75ef-abe9-4750-a802-39776852fdd3\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.310705 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-scripts\") pod \"47aa75ef-abe9-4750-a802-39776852fdd3\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.310830 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-combined-ca-bundle\") pod \"47aa75ef-abe9-4750-a802-39776852fdd3\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.310936 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-sg-core-conf-yaml\") pod \"47aa75ef-abe9-4750-a802-39776852fdd3\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.311136 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47aa75ef-abe9-4750-a802-39776852fdd3-log-httpd\") pod \"47aa75ef-abe9-4750-a802-39776852fdd3\" (UID: \"47aa75ef-abe9-4750-a802-39776852fdd3\") " Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.310838 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47aa75ef-abe9-4750-a802-39776852fdd3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "47aa75ef-abe9-4750-a802-39776852fdd3" (UID: "47aa75ef-abe9-4750-a802-39776852fdd3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.312920 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47aa75ef-abe9-4750-a802-39776852fdd3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "47aa75ef-abe9-4750-a802-39776852fdd3" (UID: "47aa75ef-abe9-4750-a802-39776852fdd3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.320824 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-scripts" (OuterVolumeSpecName: "scripts") pod "47aa75ef-abe9-4750-a802-39776852fdd3" (UID: "47aa75ef-abe9-4750-a802-39776852fdd3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.324567 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47aa75ef-abe9-4750-a802-39776852fdd3-kube-api-access-c9pmd" (OuterVolumeSpecName: "kube-api-access-c9pmd") pod "47aa75ef-abe9-4750-a802-39776852fdd3" (UID: "47aa75ef-abe9-4750-a802-39776852fdd3"). InnerVolumeSpecName "kube-api-access-c9pmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.356595 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "47aa75ef-abe9-4750-a802-39776852fdd3" (UID: "47aa75ef-abe9-4750-a802-39776852fdd3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.363952 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "47aa75ef-abe9-4750-a802-39776852fdd3" (UID: "47aa75ef-abe9-4750-a802-39776852fdd3"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.389931 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47aa75ef-abe9-4750-a802-39776852fdd3" (UID: "47aa75ef-abe9-4750-a802-39776852fdd3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.413027 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9pmd\" (UniqueName: \"kubernetes.io/projected/47aa75ef-abe9-4750-a802-39776852fdd3-kube-api-access-c9pmd\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.413074 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.413083 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.413091 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.413099 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47aa75ef-abe9-4750-a802-39776852fdd3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.413108 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/47aa75ef-abe9-4750-a802-39776852fdd3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.413117 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.421221 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-config-data" (OuterVolumeSpecName: "config-data") pod "47aa75ef-abe9-4750-a802-39776852fdd3" (UID: "47aa75ef-abe9-4750-a802-39776852fdd3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.493462 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.510884 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-sync-l5tm2" event={"ID":"72dd9b23-3681-4a76-bc25-a2271e65d0aa","Type":"ContainerStarted","Data":"2446793245fab3811933f925cd42daf98700de4848932fb88fe49b68bb307222"} Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.513931 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47aa75ef-abe9-4750-a802-39776852fdd3-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.514565 4956 generic.go:334] "Generic (PLEG): container finished" podID="47aa75ef-abe9-4750-a802-39776852fdd3" containerID="7f1db5fb5acc964a075a72dbc77e4a7b76dd333ef2bcc636124333ea323cb6ca" exitCode=0 Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.514625 4956 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"47aa75ef-abe9-4750-a802-39776852fdd3","Type":"ContainerDied","Data":"7f1db5fb5acc964a075a72dbc77e4a7b76dd333ef2bcc636124333ea323cb6ca"} Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.514635 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.514671 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"47aa75ef-abe9-4750-a802-39776852fdd3","Type":"ContainerDied","Data":"3d33f1b526a7c5e6ae192716944c51b6076b93efcf15041a072b64572c740665"} Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.514703 4956 scope.go:117] "RemoveContainer" containerID="77bf796e2308ac253d55178626033ac6197831c5593c126a7f37f0446cdadcd8" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.549753 4956 scope.go:117] "RemoveContainer" containerID="40e184c2aa6b8176c759d01407ad825c2edd13f752babbc435bb80452080acc0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.552740 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.561079 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.573883 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:41:54 crc kubenswrapper[4956]: E0314 09:41:54.574448 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47aa75ef-abe9-4750-a802-39776852fdd3" containerName="ceilometer-notification-agent" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.574474 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="47aa75ef-abe9-4750-a802-39776852fdd3" containerName="ceilometer-notification-agent" Mar 14 09:41:54 crc kubenswrapper[4956]: 
E0314 09:41:54.574536 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47aa75ef-abe9-4750-a802-39776852fdd3" containerName="proxy-httpd" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.574546 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="47aa75ef-abe9-4750-a802-39776852fdd3" containerName="proxy-httpd" Mar 14 09:41:54 crc kubenswrapper[4956]: E0314 09:41:54.574563 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47aa75ef-abe9-4750-a802-39776852fdd3" containerName="ceilometer-central-agent" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.574571 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="47aa75ef-abe9-4750-a802-39776852fdd3" containerName="ceilometer-central-agent" Mar 14 09:41:54 crc kubenswrapper[4956]: E0314 09:41:54.574586 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47aa75ef-abe9-4750-a802-39776852fdd3" containerName="sg-core" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.574594 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="47aa75ef-abe9-4750-a802-39776852fdd3" containerName="sg-core" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.574796 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="47aa75ef-abe9-4750-a802-39776852fdd3" containerName="sg-core" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.574820 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="47aa75ef-abe9-4750-a802-39776852fdd3" containerName="ceilometer-central-agent" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.574835 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="47aa75ef-abe9-4750-a802-39776852fdd3" containerName="proxy-httpd" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.574856 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="47aa75ef-abe9-4750-a802-39776852fdd3" containerName="ceilometer-notification-agent" Mar 14 09:41:54 crc 
kubenswrapper[4956]: I0314 09:41:54.577699 4956 scope.go:117] "RemoveContainer" containerID="7f1db5fb5acc964a075a72dbc77e4a7b76dd333ef2bcc636124333ea323cb6ca" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.580175 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.583604 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.583767 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.583892 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.591344 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.615278 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qbxf\" (UniqueName: \"kubernetes.io/projected/4ee019dc-4049-48be-b9a5-48bfd7d5087d-kube-api-access-5qbxf\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.615335 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-scripts\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.615357 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/4ee019dc-4049-48be-b9a5-48bfd7d5087d-run-httpd\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.615380 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-config-data\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.615395 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.615428 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.615466 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.615519 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4ee019dc-4049-48be-b9a5-48bfd7d5087d-log-httpd\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.621961 4956 scope.go:117] "RemoveContainer" containerID="4feff8fc2de81fc8f29ddec91ca120be98f23491b0e9dafae2f8e43cc6e4f36f" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.640781 4956 scope.go:117] "RemoveContainer" containerID="77bf796e2308ac253d55178626033ac6197831c5593c126a7f37f0446cdadcd8" Mar 14 09:41:54 crc kubenswrapper[4956]: E0314 09:41:54.641314 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77bf796e2308ac253d55178626033ac6197831c5593c126a7f37f0446cdadcd8\": container with ID starting with 77bf796e2308ac253d55178626033ac6197831c5593c126a7f37f0446cdadcd8 not found: ID does not exist" containerID="77bf796e2308ac253d55178626033ac6197831c5593c126a7f37f0446cdadcd8" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.641355 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77bf796e2308ac253d55178626033ac6197831c5593c126a7f37f0446cdadcd8"} err="failed to get container status \"77bf796e2308ac253d55178626033ac6197831c5593c126a7f37f0446cdadcd8\": rpc error: code = NotFound desc = could not find container \"77bf796e2308ac253d55178626033ac6197831c5593c126a7f37f0446cdadcd8\": container with ID starting with 77bf796e2308ac253d55178626033ac6197831c5593c126a7f37f0446cdadcd8 not found: ID does not exist" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.641382 4956 scope.go:117] "RemoveContainer" containerID="40e184c2aa6b8176c759d01407ad825c2edd13f752babbc435bb80452080acc0" Mar 14 09:41:54 crc kubenswrapper[4956]: E0314 09:41:54.641831 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"40e184c2aa6b8176c759d01407ad825c2edd13f752babbc435bb80452080acc0\": container with ID starting with 40e184c2aa6b8176c759d01407ad825c2edd13f752babbc435bb80452080acc0 not found: ID does not exist" containerID="40e184c2aa6b8176c759d01407ad825c2edd13f752babbc435bb80452080acc0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.641856 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e184c2aa6b8176c759d01407ad825c2edd13f752babbc435bb80452080acc0"} err="failed to get container status \"40e184c2aa6b8176c759d01407ad825c2edd13f752babbc435bb80452080acc0\": rpc error: code = NotFound desc = could not find container \"40e184c2aa6b8176c759d01407ad825c2edd13f752babbc435bb80452080acc0\": container with ID starting with 40e184c2aa6b8176c759d01407ad825c2edd13f752babbc435bb80452080acc0 not found: ID does not exist" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.641874 4956 scope.go:117] "RemoveContainer" containerID="7f1db5fb5acc964a075a72dbc77e4a7b76dd333ef2bcc636124333ea323cb6ca" Mar 14 09:41:54 crc kubenswrapper[4956]: E0314 09:41:54.642315 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f1db5fb5acc964a075a72dbc77e4a7b76dd333ef2bcc636124333ea323cb6ca\": container with ID starting with 7f1db5fb5acc964a075a72dbc77e4a7b76dd333ef2bcc636124333ea323cb6ca not found: ID does not exist" containerID="7f1db5fb5acc964a075a72dbc77e4a7b76dd333ef2bcc636124333ea323cb6ca" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.642338 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f1db5fb5acc964a075a72dbc77e4a7b76dd333ef2bcc636124333ea323cb6ca"} err="failed to get container status \"7f1db5fb5acc964a075a72dbc77e4a7b76dd333ef2bcc636124333ea323cb6ca\": rpc error: code = NotFound desc = could not find container \"7f1db5fb5acc964a075a72dbc77e4a7b76dd333ef2bcc636124333ea323cb6ca\": container with ID 
starting with 7f1db5fb5acc964a075a72dbc77e4a7b76dd333ef2bcc636124333ea323cb6ca not found: ID does not exist" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.642355 4956 scope.go:117] "RemoveContainer" containerID="4feff8fc2de81fc8f29ddec91ca120be98f23491b0e9dafae2f8e43cc6e4f36f" Mar 14 09:41:54 crc kubenswrapper[4956]: E0314 09:41:54.642684 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4feff8fc2de81fc8f29ddec91ca120be98f23491b0e9dafae2f8e43cc6e4f36f\": container with ID starting with 4feff8fc2de81fc8f29ddec91ca120be98f23491b0e9dafae2f8e43cc6e4f36f not found: ID does not exist" containerID="4feff8fc2de81fc8f29ddec91ca120be98f23491b0e9dafae2f8e43cc6e4f36f" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.642707 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4feff8fc2de81fc8f29ddec91ca120be98f23491b0e9dafae2f8e43cc6e4f36f"} err="failed to get container status \"4feff8fc2de81fc8f29ddec91ca120be98f23491b0e9dafae2f8e43cc6e4f36f\": rpc error: code = NotFound desc = could not find container \"4feff8fc2de81fc8f29ddec91ca120be98f23491b0e9dafae2f8e43cc6e4f36f\": container with ID starting with 4feff8fc2de81fc8f29ddec91ca120be98f23491b0e9dafae2f8e43cc6e4f36f not found: ID does not exist" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.717755 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.717810 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ee019dc-4049-48be-b9a5-48bfd7d5087d-log-httpd\") pod \"ceilometer-0\" (UID: 
\"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.717855 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qbxf\" (UniqueName: \"kubernetes.io/projected/4ee019dc-4049-48be-b9a5-48bfd7d5087d-kube-api-access-5qbxf\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.717885 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-scripts\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.717905 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ee019dc-4049-48be-b9a5-48bfd7d5087d-run-httpd\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.717930 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-config-data\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.717948 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.717983 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.718920 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ee019dc-4049-48be-b9a5-48bfd7d5087d-log-httpd\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.719332 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ee019dc-4049-48be-b9a5-48bfd7d5087d-run-httpd\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.722104 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.722723 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.723308 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-combined-ca-bundle\") 
pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.723683 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-config-data\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.723774 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-scripts\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.736581 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qbxf\" (UniqueName: \"kubernetes.io/projected/4ee019dc-4049-48be-b9a5-48bfd7d5087d-kube-api-access-5qbxf\") pod \"ceilometer-0\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:54 crc kubenswrapper[4956]: I0314 09:41:54.910561 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:41:55 crc kubenswrapper[4956]: I0314 09:41:55.221358 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47aa75ef-abe9-4750-a802-39776852fdd3" path="/var/lib/kubelet/pods/47aa75ef-abe9-4750-a802-39776852fdd3/volumes" Mar 14 09:41:55 crc kubenswrapper[4956]: I0314 09:41:55.339545 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:41:55 crc kubenswrapper[4956]: W0314 09:41:55.341966 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ee019dc_4049_48be_b9a5_48bfd7d5087d.slice/crio-3f0d8ddd2a3a9f81bd3adafcb5415de5181db2147ba6d28cd8483b14cb91d0a1 WatchSource:0}: Error finding container 3f0d8ddd2a3a9f81bd3adafcb5415de5181db2147ba6d28cd8483b14cb91d0a1: Status 404 returned error can't find the container with id 3f0d8ddd2a3a9f81bd3adafcb5415de5181db2147ba6d28cd8483b14cb91d0a1 Mar 14 09:41:55 crc kubenswrapper[4956]: I0314 09:41:55.424038 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:41:55 crc kubenswrapper[4956]: I0314 09:41:55.424366 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:41:55 crc kubenswrapper[4956]: I0314 09:41:55.531453 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"4ee019dc-4049-48be-b9a5-48bfd7d5087d","Type":"ContainerStarted","Data":"3f0d8ddd2a3a9f81bd3adafcb5415de5181db2147ba6d28cd8483b14cb91d0a1"}
Mar 14 09:41:55 crc kubenswrapper[4956]: I0314 09:41:55.646065 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:41:56 crc kubenswrapper[4956]: I0314 09:41:56.573354 4956 scope.go:117] "RemoveContainer" containerID="9d2f1d2c6d31fbb3c8b5a14804de445ff867d0054bc3dc64075edec8f4ace304"
Mar 14 09:41:56 crc kubenswrapper[4956]: I0314 09:41:56.600754 4956 scope.go:117] "RemoveContainer" containerID="ff9b932a913ce2d3b11e3722e0544935f1ad11da47da4716cd4b0c8233c8b32b"
Mar 14 09:41:56 crc kubenswrapper[4956]: I0314 09:41:56.658080 4956 scope.go:117] "RemoveContainer" containerID="1fa6ef6d1e7d6f3331899d16439ca90badd0f43a14f358294827f8d568e81a60"
Mar 14 09:41:56 crc kubenswrapper[4956]: I0314 09:41:56.706639 4956 scope.go:117] "RemoveContainer" containerID="162b46697deba9d151b09ac46230c77e3913f17249fc1a405592489c8669d5b5"
Mar 14 09:41:56 crc kubenswrapper[4956]: I0314 09:41:56.738584 4956 scope.go:117] "RemoveContainer" containerID="cedd2ef91032e570c2da4cac0c600ac8256b8f8cdaf02cc5f35e4b95cbdda7e9"
Mar 14 09:41:56 crc kubenswrapper[4956]: I0314 09:41:56.781931 4956 scope.go:117] "RemoveContainer" containerID="0cb602bcda56debdcb4f862f2c5a27238fa03d5f9a69c722685b1a2ecf47fe87"
Mar 14 09:41:56 crc kubenswrapper[4956]: I0314 09:41:56.853524 4956 scope.go:117] "RemoveContainer" containerID="ad78893c6fe976e2b6214171ec481a61342b2acfab97a066bc1395af18228014"
Mar 14 09:41:56 crc kubenswrapper[4956]: I0314 09:41:56.876564 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:41:57 crc kubenswrapper[4956]: I0314 09:41:57.551456 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4ee019dc-4049-48be-b9a5-48bfd7d5087d","Type":"ContainerStarted","Data":"168bb444d67e401128b5f0b814b66adaa787fde231fc18eddfae2e159454fefb"}
Mar 14 09:41:58 crc kubenswrapper[4956]: I0314 09:41:58.070954 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:41:58 crc kubenswrapper[4956]: I0314 09:41:58.563649 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4ee019dc-4049-48be-b9a5-48bfd7d5087d","Type":"ContainerStarted","Data":"074554d7db856e2f5225e85249d9133cf1b722c215b515c276018f37ff8c96ba"}
Mar 14 09:41:59 crc kubenswrapper[4956]: I0314 09:41:59.242407 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:41:59 crc kubenswrapper[4956]: I0314 09:41:59.575493 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4ee019dc-4049-48be-b9a5-48bfd7d5087d","Type":"ContainerStarted","Data":"21564abf7068131a17259f9cd7f84410ce38d1d6c9be0976c17eeb22b5673b84"}
Mar 14 09:42:00 crc kubenswrapper[4956]: I0314 09:42:00.139294 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558022-ngvx2"]
Mar 14 09:42:00 crc kubenswrapper[4956]: I0314 09:42:00.140429 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558022-ngvx2"
Mar 14 09:42:00 crc kubenswrapper[4956]: I0314 09:42:00.146349 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 09:42:00 crc kubenswrapper[4956]: I0314 09:42:00.146581 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 09:42:00 crc kubenswrapper[4956]: I0314 09:42:00.149153 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx"
Mar 14 09:42:00 crc kubenswrapper[4956]: I0314 09:42:00.149360 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558022-ngvx2"]
Mar 14 09:42:00 crc kubenswrapper[4956]: I0314 09:42:00.222948 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k7fc\" (UniqueName: \"kubernetes.io/projected/8def780b-f037-47da-ab01-83139e1a44b3-kube-api-access-2k7fc\") pod \"auto-csr-approver-29558022-ngvx2\" (UID: \"8def780b-f037-47da-ab01-83139e1a44b3\") " pod="openshift-infra/auto-csr-approver-29558022-ngvx2"
Mar 14 09:42:00 crc kubenswrapper[4956]: I0314 09:42:00.325583 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k7fc\" (UniqueName: \"kubernetes.io/projected/8def780b-f037-47da-ab01-83139e1a44b3-kube-api-access-2k7fc\") pod \"auto-csr-approver-29558022-ngvx2\" (UID: \"8def780b-f037-47da-ab01-83139e1a44b3\") " pod="openshift-infra/auto-csr-approver-29558022-ngvx2"
Mar 14 09:42:00 crc kubenswrapper[4956]: I0314 09:42:00.352757 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k7fc\" (UniqueName: \"kubernetes.io/projected/8def780b-f037-47da-ab01-83139e1a44b3-kube-api-access-2k7fc\") pod \"auto-csr-approver-29558022-ngvx2\" (UID: \"8def780b-f037-47da-ab01-83139e1a44b3\") " pod="openshift-infra/auto-csr-approver-29558022-ngvx2"
Mar 14 09:42:00 crc kubenswrapper[4956]: I0314 09:42:00.429055 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:00 crc kubenswrapper[4956]: I0314 09:42:00.470722 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558022-ngvx2"
Mar 14 09:42:01 crc kubenswrapper[4956]: I0314 09:42:01.601339 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:02 crc kubenswrapper[4956]: I0314 09:42:02.804676 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:03 crc kubenswrapper[4956]: I0314 09:42:03.990402 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:05 crc kubenswrapper[4956]: I0314 09:42:05.153889 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:06 crc kubenswrapper[4956]: I0314 09:42:06.359575 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:07 crc kubenswrapper[4956]: I0314 09:42:07.520430 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:08 crc kubenswrapper[4956]: I0314 09:42:08.705057 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:09 crc kubenswrapper[4956]: E0314 09:42:09.117178 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Mar 14 09:42:09 crc kubenswrapper[4956]: E0314 09:42:09.117852 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gbcgv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-l5tm2_watcher-kuttl-default(72dd9b23-3681-4a76-bc25-a2271e65d0aa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 14 09:42:09 crc kubenswrapper[4956]: E0314 09:42:09.119660 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/cinder-db-sync-l5tm2" podUID="72dd9b23-3681-4a76-bc25-a2271e65d0aa"
Mar 14 09:42:09 crc kubenswrapper[4956]: I0314 09:42:09.519472 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558022-ngvx2"]
Mar 14 09:42:09 crc kubenswrapper[4956]: W0314 09:42:09.524652 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8def780b_f037_47da_ab01_83139e1a44b3.slice/crio-fb62f7da5fc186c9a34453e0493bf877f913dc2f4d665286debaa9642f767d1b WatchSource:0}: Error finding container fb62f7da5fc186c9a34453e0493bf877f913dc2f4d665286debaa9642f767d1b: Status 404 returned error can't find the container with id fb62f7da5fc186c9a34453e0493bf877f913dc2f4d665286debaa9642f767d1b
Mar 14 09:42:09 crc kubenswrapper[4956]: I0314 09:42:09.687555 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4ee019dc-4049-48be-b9a5-48bfd7d5087d","Type":"ContainerStarted","Data":"8ea674bdc5e5ead930de580fd1719fe8702d827caca10b02bd5991162db898d4"}
Mar 14 09:42:09 crc kubenswrapper[4956]: I0314 09:42:09.687686 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0"
Mar 14 09:42:09 crc kubenswrapper[4956]: I0314 09:42:09.689027 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558022-ngvx2" event={"ID":"8def780b-f037-47da-ab01-83139e1a44b3","Type":"ContainerStarted","Data":"fb62f7da5fc186c9a34453e0493bf877f913dc2f4d665286debaa9642f767d1b"}
Mar 14 09:42:09 crc kubenswrapper[4956]: E0314 09:42:09.690433 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="watcher-kuttl-default/cinder-db-sync-l5tm2" podUID="72dd9b23-3681-4a76-bc25-a2271e65d0aa"
Mar 14 09:42:09 crc kubenswrapper[4956]: I0314 09:42:09.718539 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.8962423720000001 podStartE2EDuration="15.718521137s" podCreationTimestamp="2026-03-14 09:41:54 +0000 UTC" firstStartedPulling="2026-03-14 09:41:55.351391992 +0000 UTC m=+2720.864084260" lastFinishedPulling="2026-03-14 09:42:09.173670757 +0000 UTC m=+2734.686363025" observedRunningTime="2026-03-14 09:42:09.710175526 +0000 UTC m=+2735.222867784" watchObservedRunningTime="2026-03-14 09:42:09.718521137 +0000 UTC m=+2735.231213405"
Mar 14 09:42:09 crc kubenswrapper[4956]: I0314 09:42:09.887147 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:11 crc kubenswrapper[4956]: I0314 09:42:11.052249 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:11 crc kubenswrapper[4956]: I0314 09:42:11.706768 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558022-ngvx2" event={"ID":"8def780b-f037-47da-ab01-83139e1a44b3","Type":"ContainerStarted","Data":"222a4a8571366504790b1be5a3e7932b0b43e132551191bb7958df6f1a230119"}
Mar 14 09:42:11 crc kubenswrapper[4956]: I0314 09:42:11.727038 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558022-ngvx2" podStartSLOduration=10.800389181 podStartE2EDuration="11.727014644s" podCreationTimestamp="2026-03-14 09:42:00 +0000 UTC" firstStartedPulling="2026-03-14 09:42:09.527081222 +0000 UTC m=+2735.039773490" lastFinishedPulling="2026-03-14 09:42:10.453706685 +0000 UTC m=+2735.966398953" observedRunningTime="2026-03-14 09:42:11.720633492 +0000 UTC m=+2737.233325810" watchObservedRunningTime="2026-03-14 09:42:11.727014644 +0000 UTC m=+2737.239706922"
Mar 14 09:42:12 crc kubenswrapper[4956]: I0314 09:42:12.277394 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:12 crc kubenswrapper[4956]: I0314 09:42:12.714959 4956 generic.go:334] "Generic (PLEG): container finished" podID="8def780b-f037-47da-ab01-83139e1a44b3" containerID="222a4a8571366504790b1be5a3e7932b0b43e132551191bb7958df6f1a230119" exitCode=0
Mar 14 09:42:12 crc kubenswrapper[4956]: I0314 09:42:12.714998 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558022-ngvx2" event={"ID":"8def780b-f037-47da-ab01-83139e1a44b3","Type":"ContainerDied","Data":"222a4a8571366504790b1be5a3e7932b0b43e132551191bb7958df6f1a230119"}
Mar 14 09:42:13 crc kubenswrapper[4956]: I0314 09:42:13.418098 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:14 crc kubenswrapper[4956]: I0314 09:42:14.048992 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558022-ngvx2"
Mar 14 09:42:14 crc kubenswrapper[4956]: I0314 09:42:14.193571 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k7fc\" (UniqueName: \"kubernetes.io/projected/8def780b-f037-47da-ab01-83139e1a44b3-kube-api-access-2k7fc\") pod \"8def780b-f037-47da-ab01-83139e1a44b3\" (UID: \"8def780b-f037-47da-ab01-83139e1a44b3\") "
Mar 14 09:42:14 crc kubenswrapper[4956]: I0314 09:42:14.199379 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8def780b-f037-47da-ab01-83139e1a44b3-kube-api-access-2k7fc" (OuterVolumeSpecName: "kube-api-access-2k7fc") pod "8def780b-f037-47da-ab01-83139e1a44b3" (UID: "8def780b-f037-47da-ab01-83139e1a44b3"). InnerVolumeSpecName "kube-api-access-2k7fc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:42:14 crc kubenswrapper[4956]: I0314 09:42:14.296144 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k7fc\" (UniqueName: \"kubernetes.io/projected/8def780b-f037-47da-ab01-83139e1a44b3-kube-api-access-2k7fc\") on node \"crc\" DevicePath \"\""
Mar 14 09:42:14 crc kubenswrapper[4956]: I0314 09:42:14.601742 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:14 crc kubenswrapper[4956]: I0314 09:42:14.740750 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558022-ngvx2" event={"ID":"8def780b-f037-47da-ab01-83139e1a44b3","Type":"ContainerDied","Data":"fb62f7da5fc186c9a34453e0493bf877f913dc2f4d665286debaa9642f767d1b"}
Mar 14 09:42:14 crc kubenswrapper[4956]: I0314 09:42:14.740790 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb62f7da5fc186c9a34453e0493bf877f913dc2f4d665286debaa9642f767d1b"
Mar 14 09:42:14 crc kubenswrapper[4956]: I0314 09:42:14.740848 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558022-ngvx2"
Mar 14 09:42:14 crc kubenswrapper[4956]: I0314 09:42:14.786629 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558016-2cg4f"]
Mar 14 09:42:14 crc kubenswrapper[4956]: I0314 09:42:14.793204 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558016-2cg4f"]
Mar 14 09:42:15 crc kubenswrapper[4956]: I0314 09:42:15.219200 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="414a44f7-a0b0-4381-b2be-0f118c9c1106" path="/var/lib/kubelet/pods/414a44f7-a0b0-4381-b2be-0f118c9c1106/volumes"
Mar 14 09:42:15 crc kubenswrapper[4956]: I0314 09:42:15.797866 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:16 crc kubenswrapper[4956]: I0314 09:42:16.965684 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:18 crc kubenswrapper[4956]: I0314 09:42:18.128727 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:19 crc kubenswrapper[4956]: I0314 09:42:19.320573 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:20 crc kubenswrapper[4956]: I0314 09:42:20.501128 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:21 crc kubenswrapper[4956]: I0314 09:42:21.710804 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:22 crc kubenswrapper[4956]: I0314 09:42:22.872905 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:24 crc kubenswrapper[4956]: I0314 09:42:24.066208 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:24 crc kubenswrapper[4956]: I0314 09:42:24.919357 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0"
Mar 14 09:42:25 crc kubenswrapper[4956]: I0314 09:42:25.267751 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:25 crc kubenswrapper[4956]: I0314 09:42:25.423798 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 09:42:25 crc kubenswrapper[4956]: I0314 09:42:25.424190 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 09:42:25 crc kubenswrapper[4956]: I0314 09:42:25.832881 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-sync-l5tm2" event={"ID":"72dd9b23-3681-4a76-bc25-a2271e65d0aa","Type":"ContainerStarted","Data":"af87f18ef30904aa0e3c3acccabb089142350d73082840a4d380c92b738945e8"}
Mar 14 09:42:25 crc kubenswrapper[4956]: I0314 09:42:25.849936 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-db-sync-l5tm2" podStartSLOduration=2.198174174 podStartE2EDuration="32.849915447s" podCreationTimestamp="2026-03-14 09:41:53 +0000 UTC" firstStartedPulling="2026-03-14 09:41:54.006391238 +0000 UTC m=+2719.519083506" lastFinishedPulling="2026-03-14 09:42:24.658132511 +0000 UTC m=+2750.170824779" observedRunningTime="2026-03-14 09:42:25.848605534 +0000 UTC m=+2751.361297842" watchObservedRunningTime="2026-03-14 09:42:25.849915447 +0000 UTC m=+2751.362607715"
Mar 14 09:42:26 crc kubenswrapper[4956]: I0314 09:42:26.437189 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:27 crc kubenswrapper[4956]: I0314 09:42:27.636370 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:28 crc kubenswrapper[4956]: I0314 09:42:28.836044 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:29 crc kubenswrapper[4956]: I0314 09:42:29.863458 4956 generic.go:334] "Generic (PLEG): container finished" podID="72dd9b23-3681-4a76-bc25-a2271e65d0aa" containerID="af87f18ef30904aa0e3c3acccabb089142350d73082840a4d380c92b738945e8" exitCode=0
Mar 14 09:42:29 crc kubenswrapper[4956]: I0314 09:42:29.863535 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-sync-l5tm2" event={"ID":"72dd9b23-3681-4a76-bc25-a2271e65d0aa","Type":"ContainerDied","Data":"af87f18ef30904aa0e3c3acccabb089142350d73082840a4d380c92b738945e8"}
Mar 14 09:42:30 crc kubenswrapper[4956]: I0314 09:42:30.022821 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.224727 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log"
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.261408 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-db-sync-l5tm2"
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.374403 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbcgv\" (UniqueName: \"kubernetes.io/projected/72dd9b23-3681-4a76-bc25-a2271e65d0aa-kube-api-access-gbcgv\") pod \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") "
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.374461 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-scripts\") pod \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") "
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.374587 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-combined-ca-bundle\") pod \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") "
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.374649 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-config-data\") pod \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") "
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.374691 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-db-sync-config-data\") pod \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") "
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.374715 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72dd9b23-3681-4a76-bc25-a2271e65d0aa-etc-machine-id\") pod \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\" (UID: \"72dd9b23-3681-4a76-bc25-a2271e65d0aa\") "
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.375083 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72dd9b23-3681-4a76-bc25-a2271e65d0aa-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "72dd9b23-3681-4a76-bc25-a2271e65d0aa" (UID: "72dd9b23-3681-4a76-bc25-a2271e65d0aa"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.380741 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "72dd9b23-3681-4a76-bc25-a2271e65d0aa" (UID: "72dd9b23-3681-4a76-bc25-a2271e65d0aa"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.380957 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-scripts" (OuterVolumeSpecName: "scripts") pod "72dd9b23-3681-4a76-bc25-a2271e65d0aa" (UID: "72dd9b23-3681-4a76-bc25-a2271e65d0aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.381661 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72dd9b23-3681-4a76-bc25-a2271e65d0aa-kube-api-access-gbcgv" (OuterVolumeSpecName: "kube-api-access-gbcgv") pod "72dd9b23-3681-4a76-bc25-a2271e65d0aa" (UID: "72dd9b23-3681-4a76-bc25-a2271e65d0aa"). InnerVolumeSpecName "kube-api-access-gbcgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.397826 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72dd9b23-3681-4a76-bc25-a2271e65d0aa" (UID: "72dd9b23-3681-4a76-bc25-a2271e65d0aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.416036 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-config-data" (OuterVolumeSpecName: "config-data") pod "72dd9b23-3681-4a76-bc25-a2271e65d0aa" (UID: "72dd9b23-3681-4a76-bc25-a2271e65d0aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.477837 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.477926 4956 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.477938 4956 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72dd9b23-3681-4a76-bc25-a2271e65d0aa-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.477950 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbcgv\" (UniqueName: \"kubernetes.io/projected/72dd9b23-3681-4a76-bc25-a2271e65d0aa-kube-api-access-gbcgv\") on node \"crc\" DevicePath \"\""
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.477997 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.478012 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dd9b23-3681-4a76-bc25-a2271e65d0aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.885145 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-sync-l5tm2" event={"ID":"72dd9b23-3681-4a76-bc25-a2271e65d0aa","Type":"ContainerDied","Data":"2446793245fab3811933f925cd42daf98700de4848932fb88fe49b68bb307222"}
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.885202 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2446793245fab3811933f925cd42daf98700de4848932fb88fe49b68bb307222"
Mar 14 09:42:31 crc kubenswrapper[4956]: I0314 09:42:31.885285 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-db-sync-l5tm2"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.212673 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"]
Mar 14 09:42:32 crc kubenswrapper[4956]: E0314 09:42:32.213088 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8def780b-f037-47da-ab01-83139e1a44b3" containerName="oc"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.213111 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8def780b-f037-47da-ab01-83139e1a44b3" containerName="oc"
Mar 14 09:42:32 crc kubenswrapper[4956]: E0314 09:42:32.213156 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72dd9b23-3681-4a76-bc25-a2271e65d0aa" containerName="cinder-db-sync"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.213166 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="72dd9b23-3681-4a76-bc25-a2271e65d0aa" containerName="cinder-db-sync"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.213355 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="72dd9b23-3681-4a76-bc25-a2271e65d0aa" containerName="cinder-db-sync"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.213392 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8def780b-f037-47da-ab01-83139e1a44b3" containerName="oc"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.214539 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.216388 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-cinder-dockercfg-m88cn"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.216541 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-scripts"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.217744 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-config-data"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.222319 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-scheduler-config-data"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.224920 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-backup-0"]
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.230050 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.232920 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-backup-config-data"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.259698 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"]
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.269500 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"]
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.392807 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-lib-modules\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.392880 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-run\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.395363 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-scripts\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.395580 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj8w2\" (UniqueName: \"kubernetes.io/projected/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-kube-api-access-lj8w2\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.395863 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.395952 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9qsx\" (UniqueName: \"kubernetes.io/projected/697b007c-6fbc-4031-ae2b-3ef786c7dab4-kube-api-access-g9qsx\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.396164 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-config-data-custom\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.396231 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-nvme\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.396266 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.396301 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.396393 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-dev\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.396469 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.396517 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0"
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.396578 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName:
\"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-config-data\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.396626 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.396710 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.396795 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.396821 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-scripts\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.396841 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-config-data\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.396865 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/697b007c-6fbc-4031-ae2b-3ef786c7dab4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.396886 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.396955 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-sys\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.397434 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.402011 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.433087 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.434871 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.437006 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-api-config-data" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.453627 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.499196 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-config-data\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.499269 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.499311 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.499350 
4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.499374 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-scripts\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.499393 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-config-data\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.499417 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/697b007c-6fbc-4031-ae2b-3ef786c7dab4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.499438 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.499467 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-sys\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.499514 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.499542 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-lib-modules\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.499570 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-run\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.499588 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-scripts\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.499622 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj8w2\" (UniqueName: \"kubernetes.io/projected/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-kube-api-access-lj8w2\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " 
pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.499658 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.499711 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9qsx\" (UniqueName: \"kubernetes.io/projected/697b007c-6fbc-4031-ae2b-3ef786c7dab4-kube-api-access-g9qsx\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.499756 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-config-data-custom\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.500517 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-sys\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.500559 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.500657 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.500690 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.505107 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-config-data-custom\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.505572 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-scripts\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.508976 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.509050 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-lib-modules\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.509243 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.509297 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-run\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.509374 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/697b007c-6fbc-4031-ae2b-3ef786c7dab4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.509481 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-nvme\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.509626 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-nvme\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " 
pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.510096 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.510264 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.510353 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-dev\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.510426 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.510467 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.510599 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-config-data\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.510652 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-dev\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.511021 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.515732 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-config-data\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.516155 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.518434 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-cert-memcached-mtls\") 
pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.520175 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.523300 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.531444 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9qsx\" (UniqueName: \"kubernetes.io/projected/697b007c-6fbc-4031-ae2b-3ef786c7dab4-kube-api-access-g9qsx\") pod \"cinder-scheduler-0\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.534540 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.537248 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj8w2\" (UniqueName: \"kubernetes.io/projected/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-kube-api-access-lj8w2\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.541698 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-scripts\") pod \"cinder-backup-0\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.554304 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.612515 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gxhs\" (UniqueName: \"kubernetes.io/projected/5c4efdb8-3eac-4467-b226-523349c43b99-kube-api-access-6gxhs\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.612568 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c4efdb8-3eac-4467-b226-523349c43b99-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.612590 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-scripts\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.612629 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.612663 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.612722 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-config-data-custom\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.612750 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-config-data\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.612765 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c4efdb8-3eac-4467-b226-523349c43b99-logs\") 
pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.714505 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-config-data-custom\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.714563 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-config-data\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.714584 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c4efdb8-3eac-4467-b226-523349c43b99-logs\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.714625 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gxhs\" (UniqueName: \"kubernetes.io/projected/5c4efdb8-3eac-4467-b226-523349c43b99-kube-api-access-6gxhs\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.714648 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c4efdb8-3eac-4467-b226-523349c43b99-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 
09:42:32.714666 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-scripts\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.714714 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.714754 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.720361 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c4efdb8-3eac-4467-b226-523349c43b99-logs\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.720820 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.720911 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/5c4efdb8-3eac-4467-b226-523349c43b99-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.725421 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-scripts\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.725871 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-config-data\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.726048 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.727955 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-config-data-custom\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.750972 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gxhs\" (UniqueName: \"kubernetes.io/projected/5c4efdb8-3eac-4467-b226-523349c43b99-kube-api-access-6gxhs\") pod \"cinder-api-0\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " pod="watcher-kuttl-default/cinder-api-0" 
Mar 14 09:42:32 crc kubenswrapper[4956]: I0314 09:42:32.754802 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:33 crc kubenswrapper[4956]: I0314 09:42:33.193604 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Mar 14 09:42:33 crc kubenswrapper[4956]: I0314 09:42:33.283554 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Mar 14 09:42:33 crc kubenswrapper[4956]: W0314 09:42:33.284189 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod697b007c_6fbc_4031_ae2b_3ef786c7dab4.slice/crio-b1f51c8da9329654ccaafe283125eb60d7a119ad3410d62e6d53ae7224f0f1e1 WatchSource:0}: Error finding container b1f51c8da9329654ccaafe283125eb60d7a119ad3410d62e6d53ae7224f0f1e1: Status 404 returned error can't find the container with id b1f51c8da9329654ccaafe283125eb60d7a119ad3410d62e6d53ae7224f0f1e1 Mar 14 09:42:33 crc kubenswrapper[4956]: W0314 09:42:33.432458 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c4efdb8_3eac_4467_b226_523349c43b99.slice/crio-6353937e46251c30f23e56f537094259167e5685e647b6deab199ae247a4d6b0 WatchSource:0}: Error finding container 6353937e46251c30f23e56f537094259167e5685e647b6deab199ae247a4d6b0: Status 404 returned error can't find the container with id 6353937e46251c30f23e56f537094259167e5685e647b6deab199ae247a4d6b0 Mar 14 09:42:33 crc kubenswrapper[4956]: I0314 09:42:33.433101 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Mar 14 09:42:33 crc kubenswrapper[4956]: I0314 09:42:33.599323 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" 
Mar 14 09:42:33 crc kubenswrapper[4956]: I0314 09:42:33.925416 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"697b007c-6fbc-4031-ae2b-3ef786c7dab4","Type":"ContainerStarted","Data":"b1f51c8da9329654ccaafe283125eb60d7a119ad3410d62e6d53ae7224f0f1e1"} Mar 14 09:42:33 crc kubenswrapper[4956]: I0314 09:42:33.926259 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527","Type":"ContainerStarted","Data":"6773de11469e45f27d024e866fe2700910cea3fc8cf0d016a05225a4ef59bd94"} Mar 14 09:42:33 crc kubenswrapper[4956]: I0314 09:42:33.927037 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"5c4efdb8-3eac-4467-b226-523349c43b99","Type":"ContainerStarted","Data":"6353937e46251c30f23e56f537094259167e5685e647b6deab199ae247a4d6b0"} Mar 14 09:42:34 crc kubenswrapper[4956]: I0314 09:42:34.080811 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Mar 14 09:42:34 crc kubenswrapper[4956]: I0314 09:42:34.849465 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:42:34 crc kubenswrapper[4956]: I0314 09:42:34.972599 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527","Type":"ContainerStarted","Data":"721156f5376f735888d07c7703c2089e5ce02e21fb07e7f22f1a14926ed21a8d"} Mar 14 09:42:34 crc kubenswrapper[4956]: I0314 09:42:34.973043 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527","Type":"ContainerStarted","Data":"64a0aa04664f9708241b656876b97fc94ba586e6a3a188a6b8c2b3bda5559a4b"} Mar 14 09:42:34 crc 
kubenswrapper[4956]: I0314 09:42:34.979106 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"5c4efdb8-3eac-4467-b226-523349c43b99","Type":"ContainerStarted","Data":"247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533"} Mar 14 09:42:35 crc kubenswrapper[4956]: I0314 09:42:35.013548 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-backup-0" podStartSLOduration=2.261738947 podStartE2EDuration="3.013520886s" podCreationTimestamp="2026-03-14 09:42:32 +0000 UTC" firstStartedPulling="2026-03-14 09:42:33.202308102 +0000 UTC m=+2758.715000370" lastFinishedPulling="2026-03-14 09:42:33.954090041 +0000 UTC m=+2759.466782309" observedRunningTime="2026-03-14 09:42:35.007978876 +0000 UTC m=+2760.520671154" watchObservedRunningTime="2026-03-14 09:42:35.013520886 +0000 UTC m=+2760.526213154" Mar 14 09:42:35 crc kubenswrapper[4956]: I0314 09:42:35.994281 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"5c4efdb8-3eac-4467-b226-523349c43b99","Type":"ContainerStarted","Data":"919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163"} Mar 14 09:42:35 crc kubenswrapper[4956]: I0314 09:42:35.994623 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:35 crc kubenswrapper[4956]: I0314 09:42:35.994544 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-api-0" podUID="5c4efdb8-3eac-4467-b226-523349c43b99" containerName="cinder-api" containerID="cri-o://919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163" gracePeriod=30 Mar 14 09:42:35 crc kubenswrapper[4956]: I0314 09:42:35.994452 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-api-0" podUID="5c4efdb8-3eac-4467-b226-523349c43b99" 
containerName="cinder-api-log" containerID="cri-o://247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533" gracePeriod=30 Mar 14 09:42:35 crc kubenswrapper[4956]: I0314 09:42:35.998560 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"697b007c-6fbc-4031-ae2b-3ef786c7dab4","Type":"ContainerStarted","Data":"57c8cf4ea95bcd8fe72c096c9040db255c24e4f4cee5c17d87ade6f6471c0847"} Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.019817 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-api-0" podStartSLOduration=4.019801235 podStartE2EDuration="4.019801235s" podCreationTimestamp="2026-03-14 09:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:42:36.015986868 +0000 UTC m=+2761.528679136" watchObservedRunningTime="2026-03-14 09:42:36.019801235 +0000 UTC m=+2761.532493503" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.068868 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.713027 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.806535 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-config-data\") pod \"5c4efdb8-3eac-4467-b226-523349c43b99\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.806660 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c4efdb8-3eac-4467-b226-523349c43b99-etc-machine-id\") pod \"5c4efdb8-3eac-4467-b226-523349c43b99\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.806684 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-scripts\") pod \"5c4efdb8-3eac-4467-b226-523349c43b99\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.806727 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c4efdb8-3eac-4467-b226-523349c43b99-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5c4efdb8-3eac-4467-b226-523349c43b99" (UID: "5c4efdb8-3eac-4467-b226-523349c43b99"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.806746 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-cert-memcached-mtls\") pod \"5c4efdb8-3eac-4467-b226-523349c43b99\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.806860 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-combined-ca-bundle\") pod \"5c4efdb8-3eac-4467-b226-523349c43b99\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.807125 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c4efdb8-3eac-4467-b226-523349c43b99-logs\") pod \"5c4efdb8-3eac-4467-b226-523349c43b99\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.807227 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-config-data-custom\") pod \"5c4efdb8-3eac-4467-b226-523349c43b99\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.807319 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gxhs\" (UniqueName: \"kubernetes.io/projected/5c4efdb8-3eac-4467-b226-523349c43b99-kube-api-access-6gxhs\") pod \"5c4efdb8-3eac-4467-b226-523349c43b99\" (UID: \"5c4efdb8-3eac-4467-b226-523349c43b99\") " Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.807891 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5c4efdb8-3eac-4467-b226-523349c43b99-logs" (OuterVolumeSpecName: "logs") pod "5c4efdb8-3eac-4467-b226-523349c43b99" (UID: "5c4efdb8-3eac-4467-b226-523349c43b99"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.809914 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c4efdb8-3eac-4467-b226-523349c43b99-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.809941 4956 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c4efdb8-3eac-4467-b226-523349c43b99-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.814409 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c4efdb8-3eac-4467-b226-523349c43b99-kube-api-access-6gxhs" (OuterVolumeSpecName: "kube-api-access-6gxhs") pod "5c4efdb8-3eac-4467-b226-523349c43b99" (UID: "5c4efdb8-3eac-4467-b226-523349c43b99"). InnerVolumeSpecName "kube-api-access-6gxhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.814608 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-scripts" (OuterVolumeSpecName: "scripts") pod "5c4efdb8-3eac-4467-b226-523349c43b99" (UID: "5c4efdb8-3eac-4467-b226-523349c43b99"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.815402 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5c4efdb8-3eac-4467-b226-523349c43b99" (UID: "5c4efdb8-3eac-4467-b226-523349c43b99"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.843830 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c4efdb8-3eac-4467-b226-523349c43b99" (UID: "5c4efdb8-3eac-4467-b226-523349c43b99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.864642 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-config-data" (OuterVolumeSpecName: "config-data") pod "5c4efdb8-3eac-4467-b226-523349c43b99" (UID: "5c4efdb8-3eac-4467-b226-523349c43b99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.882270 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "5c4efdb8-3eac-4467-b226-523349c43b99" (UID: "5c4efdb8-3eac-4467-b226-523349c43b99"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.911531 4956 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.911565 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gxhs\" (UniqueName: \"kubernetes.io/projected/5c4efdb8-3eac-4467-b226-523349c43b99-kube-api-access-6gxhs\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.911599 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.911607 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.911615 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:36 crc kubenswrapper[4956]: I0314 09:42:36.911623 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4efdb8-3eac-4467-b226-523349c43b99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.018944 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.018972 4956 generic.go:334] "Generic (PLEG): container finished" podID="5c4efdb8-3eac-4467-b226-523349c43b99" containerID="919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163" exitCode=0 Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.018989 4956 generic.go:334] "Generic (PLEG): container finished" podID="5c4efdb8-3eac-4467-b226-523349c43b99" containerID="247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533" exitCode=143 Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.018942 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"5c4efdb8-3eac-4467-b226-523349c43b99","Type":"ContainerDied","Data":"919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163"} Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.019104 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"5c4efdb8-3eac-4467-b226-523349c43b99","Type":"ContainerDied","Data":"247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533"} Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.019119 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"5c4efdb8-3eac-4467-b226-523349c43b99","Type":"ContainerDied","Data":"6353937e46251c30f23e56f537094259167e5685e647b6deab199ae247a4d6b0"} Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.019163 4956 scope.go:117] "RemoveContainer" containerID="919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.021496 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"697b007c-6fbc-4031-ae2b-3ef786c7dab4","Type":"ContainerStarted","Data":"72f6ac381ecff25e58fc1391fdf97e24078a8b4f8e0ccb6ea2132328bca12054"} 
Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.044365 4956 scope.go:117] "RemoveContainer" containerID="247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.046433 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-scheduler-0" podStartSLOduration=3.9212461 podStartE2EDuration="5.04642211s" podCreationTimestamp="2026-03-14 09:42:32 +0000 UTC" firstStartedPulling="2026-03-14 09:42:33.28954928 +0000 UTC m=+2758.802241548" lastFinishedPulling="2026-03-14 09:42:34.4147253 +0000 UTC m=+2759.927417558" observedRunningTime="2026-03-14 09:42:37.045172608 +0000 UTC m=+2762.557864876" watchObservedRunningTime="2026-03-14 09:42:37.04642211 +0000 UTC m=+2762.559114378" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.067697 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.078360 4956 scope.go:117] "RemoveContainer" containerID="919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.079118 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Mar 14 09:42:37 crc kubenswrapper[4956]: E0314 09:42:37.079313 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163\": container with ID starting with 919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163 not found: ID does not exist" containerID="919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.079363 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163"} 
err="failed to get container status \"919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163\": rpc error: code = NotFound desc = could not find container \"919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163\": container with ID starting with 919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163 not found: ID does not exist" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.079393 4956 scope.go:117] "RemoveContainer" containerID="247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533" Mar 14 09:42:37 crc kubenswrapper[4956]: E0314 09:42:37.082802 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533\": container with ID starting with 247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533 not found: ID does not exist" containerID="247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.082846 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533"} err="failed to get container status \"247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533\": rpc error: code = NotFound desc = could not find container \"247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533\": container with ID starting with 247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533 not found: ID does not exist" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.082871 4956 scope.go:117] "RemoveContainer" containerID="919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.084106 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163"} err="failed to get container status \"919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163\": rpc error: code = NotFound desc = could not find container \"919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163\": container with ID starting with 919760a232c9df2ca7c2cac34382a1c0ae523629429fb8d08f4e679b8b404163 not found: ID does not exist" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.084131 4956 scope.go:117] "RemoveContainer" containerID="247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.084851 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533"} err="failed to get container status \"247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533\": rpc error: code = NotFound desc = could not find container \"247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533\": container with ID starting with 247af9545984645c50b3977ea4a9053d1ad3f506456b3ce00ff3688753dc2533 not found: ID does not exist" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.104526 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Mar 14 09:42:37 crc kubenswrapper[4956]: E0314 09:42:37.104973 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4efdb8-3eac-4467-b226-523349c43b99" containerName="cinder-api-log" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.104989 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4efdb8-3eac-4467-b226-523349c43b99" containerName="cinder-api-log" Mar 14 09:42:37 crc kubenswrapper[4956]: E0314 09:42:37.105000 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4efdb8-3eac-4467-b226-523349c43b99" containerName="cinder-api" Mar 14 
09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.105006 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4efdb8-3eac-4467-b226-523349c43b99" containerName="cinder-api" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.105173 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c4efdb8-3eac-4467-b226-523349c43b99" containerName="cinder-api" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.105200 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c4efdb8-3eac-4467-b226-523349c43b99" containerName="cinder-api-log" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.106144 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.108217 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-cinder-internal-svc" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.108377 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-cinder-public-svc" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.108380 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-api-config-data" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.115408 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.215321 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-config-data-custom\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.215365 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be8317c5-1327-4216-bf3f-400253bdfa3c-logs\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.215384 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.215401 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.215505 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-config-data\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.215531 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-scripts\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.215552 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dknw8\" (UniqueName: 
\"kubernetes.io/projected/be8317c5-1327-4216-bf3f-400253bdfa3c-kube-api-access-dknw8\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.215571 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.215638 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.215696 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be8317c5-1327-4216-bf3f-400253bdfa3c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.218222 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c4efdb8-3eac-4467-b226-523349c43b99" path="/var/lib/kubelet/pods/5c4efdb8-3eac-4467-b226-523349c43b99/volumes" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.283159 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.317839 4956 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be8317c5-1327-4216-bf3f-400253bdfa3c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.317946 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-config-data-custom\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.317990 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be8317c5-1327-4216-bf3f-400253bdfa3c-logs\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.318015 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.318034 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.318055 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-config-data\") pod \"cinder-api-0\" (UID: 
\"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.318078 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-scripts\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.318107 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dknw8\" (UniqueName: \"kubernetes.io/projected/be8317c5-1327-4216-bf3f-400253bdfa3c-kube-api-access-dknw8\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.318141 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.318177 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.318926 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be8317c5-1327-4216-bf3f-400253bdfa3c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.318960 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be8317c5-1327-4216-bf3f-400253bdfa3c-logs\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.322446 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.323475 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.324653 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-scripts\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.325958 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-config-data\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.329001 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.335109 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.335114 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.338537 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dknw8\" (UniqueName: \"kubernetes.io/projected/be8317c5-1327-4216-bf3f-400253bdfa3c-kube-api-access-dknw8\") pod \"cinder-api-0\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.420254 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.535449 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:37 crc kubenswrapper[4956]: I0314 09:42:37.555767 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:38 crc kubenswrapper[4956]: I0314 09:42:38.541776 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:42:39 crc kubenswrapper[4956]: I0314 09:42:39.728448 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:42:39 crc kubenswrapper[4956]: I0314 09:42:39.888696 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Mar 14 09:42:40 crc kubenswrapper[4956]: I0314 09:42:40.055908 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"be8317c5-1327-4216-bf3f-400253bdfa3c","Type":"ContainerStarted","Data":"e5a75031b3babd6966f7fe673ad19d4493825a58f18f7206dd63adc1a93730ce"} Mar 14 09:42:40 crc kubenswrapper[4956]: I0314 09:42:40.914394 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:42:41 crc kubenswrapper[4956]: I0314 09:42:41.066786 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"be8317c5-1327-4216-bf3f-400253bdfa3c","Type":"ContainerStarted","Data":"569c9fa74d528637f1f5a56e76415682e4de7000b1e42d582e04fd714e5dea15"} Mar 14 09:42:41 crc 
kubenswrapper[4956]: I0314 09:42:41.066851 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"be8317c5-1327-4216-bf3f-400253bdfa3c","Type":"ContainerStarted","Data":"8a8784e10b8f2c58186aede58c8d97b8877e2b9b542f46aab35dbf186364889a"} Mar 14 09:42:41 crc kubenswrapper[4956]: I0314 09:42:41.066924 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:41 crc kubenswrapper[4956]: I0314 09:42:41.084394 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-api-0" podStartSLOduration=4.084375404 podStartE2EDuration="4.084375404s" podCreationTimestamp="2026-03-14 09:42:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:42:41.083461121 +0000 UTC m=+2766.596153419" watchObservedRunningTime="2026-03-14 09:42:41.084375404 +0000 UTC m=+2766.597067672" Mar 14 09:42:42 crc kubenswrapper[4956]: I0314 09:42:42.086599 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:42:42 crc kubenswrapper[4956]: I0314 09:42:42.739825 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:42 crc kubenswrapper[4956]: I0314 09:42:42.781022 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:42 crc kubenswrapper[4956]: I0314 09:42:42.792029 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Mar 14 09:42:42 crc kubenswrapper[4956]: I0314 09:42:42.847617 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Mar 14 09:42:43 crc 
kubenswrapper[4956]: I0314 09:42:43.092420 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-scheduler-0" podUID="697b007c-6fbc-4031-ae2b-3ef786c7dab4" containerName="cinder-scheduler" containerID="cri-o://57c8cf4ea95bcd8fe72c096c9040db255c24e4f4cee5c17d87ade6f6471c0847" gracePeriod=30 Mar 14 09:42:43 crc kubenswrapper[4956]: I0314 09:42:43.092612 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-scheduler-0" podUID="697b007c-6fbc-4031-ae2b-3ef786c7dab4" containerName="probe" containerID="cri-o://72f6ac381ecff25e58fc1391fdf97e24078a8b4f8e0ccb6ea2132328bca12054" gracePeriod=30 Mar 14 09:42:43 crc kubenswrapper[4956]: I0314 09:42:43.092699 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-backup-0" podUID="f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" containerName="probe" containerID="cri-o://721156f5376f735888d07c7703c2089e5ce02e21fb07e7f22f1a14926ed21a8d" gracePeriod=30 Mar 14 09:42:43 crc kubenswrapper[4956]: I0314 09:42:43.092709 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-backup-0" podUID="f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" containerName="cinder-backup" containerID="cri-o://64a0aa04664f9708241b656876b97fc94ba586e6a3a188a6b8c2b3bda5559a4b" gracePeriod=30 Mar 14 09:42:43 crc kubenswrapper[4956]: I0314 09:42:43.314199 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:42:44 crc kubenswrapper[4956]: I0314 09:42:44.101764 4956 generic.go:334] "Generic (PLEG): container finished" podID="f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" containerID="721156f5376f735888d07c7703c2089e5ce02e21fb07e7f22f1a14926ed21a8d" exitCode=0 Mar 14 09:42:44 crc kubenswrapper[4956]: I0314 09:42:44.101841 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527","Type":"ContainerDied","Data":"721156f5376f735888d07c7703c2089e5ce02e21fb07e7f22f1a14926ed21a8d"} Mar 14 09:42:44 crc kubenswrapper[4956]: I0314 09:42:44.104100 4956 generic.go:334] "Generic (PLEG): container finished" podID="697b007c-6fbc-4031-ae2b-3ef786c7dab4" containerID="72f6ac381ecff25e58fc1391fdf97e24078a8b4f8e0ccb6ea2132328bca12054" exitCode=0 Mar 14 09:42:44 crc kubenswrapper[4956]: I0314 09:42:44.104155 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"697b007c-6fbc-4031-ae2b-3ef786c7dab4","Type":"ContainerDied","Data":"72f6ac381ecff25e58fc1391fdf97e24078a8b4f8e0ccb6ea2132328bca12054"} Mar 14 09:42:44 crc kubenswrapper[4956]: I0314 09:42:44.407524 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:42:44 crc kubenswrapper[4956]: I0314 09:42:44.407819 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51" containerName="watcher-decision-engine" containerID="cri-o://64abe770ac3c8873baddb1131c39b9e1338f98b06403d389b78593e1c547a6be" gracePeriod=30 Mar 14 09:42:44 crc kubenswrapper[4956]: I0314 09:42:44.524111 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.040705 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.046679 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.120683 4956 generic.go:334] "Generic (PLEG): container finished" podID="f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" containerID="64a0aa04664f9708241b656876b97fc94ba586e6a3a188a6b8c2b3bda5559a4b" exitCode=0 Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.120745 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527","Type":"ContainerDied","Data":"64a0aa04664f9708241b656876b97fc94ba586e6a3a188a6b8c2b3bda5559a4b"} Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.120775 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527","Type":"ContainerDied","Data":"6773de11469e45f27d024e866fe2700910cea3fc8cf0d016a05225a4ef59bd94"} Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.120792 4956 scope.go:117] "RemoveContainer" containerID="721156f5376f735888d07c7703c2089e5ce02e21fb07e7f22f1a14926ed21a8d" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.120902 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.128091 4956 generic.go:334] "Generic (PLEG): container finished" podID="697b007c-6fbc-4031-ae2b-3ef786c7dab4" containerID="57c8cf4ea95bcd8fe72c096c9040db255c24e4f4cee5c17d87ade6f6471c0847" exitCode=0 Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.128134 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"697b007c-6fbc-4031-ae2b-3ef786c7dab4","Type":"ContainerDied","Data":"57c8cf4ea95bcd8fe72c096c9040db255c24e4f4cee5c17d87ade6f6471c0847"} Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.128160 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"697b007c-6fbc-4031-ae2b-3ef786c7dab4","Type":"ContainerDied","Data":"b1f51c8da9329654ccaafe283125eb60d7a119ad3410d62e6d53ae7224f0f1e1"} Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.128457 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.145226 4956 scope.go:117] "RemoveContainer" containerID="64a0aa04664f9708241b656876b97fc94ba586e6a3a188a6b8c2b3bda5559a4b" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159254 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-config-data\") pod \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159335 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-iscsi\") pod \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159371 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-config-data-custom\") pod \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159404 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/697b007c-6fbc-4031-ae2b-3ef786c7dab4-etc-machine-id\") pod \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159425 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9qsx\" (UniqueName: \"kubernetes.io/projected/697b007c-6fbc-4031-ae2b-3ef786c7dab4-kube-api-access-g9qsx\") pod \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\" (UID: 
\"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159448 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj8w2\" (UniqueName: \"kubernetes.io/projected/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-kube-api-access-lj8w2\") pod \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159464 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-locks-brick\") pod \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159496 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-config-data\") pod \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159517 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-cert-memcached-mtls\") pod \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159537 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-combined-ca-bundle\") pod \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159574 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-run\") pod \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159594 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-lib-cinder\") pod \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159624 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-scripts\") pod \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159663 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-sys\") pod \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159703 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-dev\") pod \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159726 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-scripts\") pod \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\" (UID: \"697b007c-6fbc-4031-ae2b-3ef786c7dab4\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159755 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-cert-memcached-mtls\") pod \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159798 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-lib-modules\") pod \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159843 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-config-data-custom\") pod \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159871 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-locks-cinder\") pod \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159892 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-machine-id\") pod \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.159921 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-nvme\") pod \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " Mar 14 09:42:45 
crc kubenswrapper[4956]: I0314 09:42:45.159953 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-combined-ca-bundle\") pod \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\" (UID: \"f9e454b6-b07d-4c7f-a40b-b35dc9ff9527\") " Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.162111 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" (UID: "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.162176 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" (UID: "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.162199 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" (UID: "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.162830 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" (UID: "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.165012 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-run" (OuterVolumeSpecName: "run") pod "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" (UID: "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.165022 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-sys" (OuterVolumeSpecName: "sys") pod "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" (UID: "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.165077 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" (UID: "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527"). InnerVolumeSpecName "var-lib-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.165124 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" (UID: "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.165183 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" (UID: "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.166324 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-scripts" (OuterVolumeSpecName: "scripts") pod "697b007c-6fbc-4031-ae2b-3ef786c7dab4" (UID: "697b007c-6fbc-4031-ae2b-3ef786c7dab4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.166376 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/697b007c-6fbc-4031-ae2b-3ef786c7dab4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "697b007c-6fbc-4031-ae2b-3ef786c7dab4" (UID: "697b007c-6fbc-4031-ae2b-3ef786c7dab4"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.166346 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/697b007c-6fbc-4031-ae2b-3ef786c7dab4-kube-api-access-g9qsx" (OuterVolumeSpecName: "kube-api-access-g9qsx") pod "697b007c-6fbc-4031-ae2b-3ef786c7dab4" (UID: "697b007c-6fbc-4031-ae2b-3ef786c7dab4"). InnerVolumeSpecName "kube-api-access-g9qsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.166430 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-dev" (OuterVolumeSpecName: "dev") pod "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" (UID: "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.167493 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" (UID: "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.169612 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-scripts" (OuterVolumeSpecName: "scripts") pod "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" (UID: "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.173624 4956 scope.go:117] "RemoveContainer" containerID="721156f5376f735888d07c7703c2089e5ce02e21fb07e7f22f1a14926ed21a8d" Mar 14 09:42:45 crc kubenswrapper[4956]: E0314 09:42:45.174453 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"721156f5376f735888d07c7703c2089e5ce02e21fb07e7f22f1a14926ed21a8d\": container with ID starting with 721156f5376f735888d07c7703c2089e5ce02e21fb07e7f22f1a14926ed21a8d not found: ID does not exist" containerID="721156f5376f735888d07c7703c2089e5ce02e21fb07e7f22f1a14926ed21a8d" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.174499 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721156f5376f735888d07c7703c2089e5ce02e21fb07e7f22f1a14926ed21a8d"} err="failed to get container status \"721156f5376f735888d07c7703c2089e5ce02e21fb07e7f22f1a14926ed21a8d\": rpc error: code = NotFound desc = could not find container \"721156f5376f735888d07c7703c2089e5ce02e21fb07e7f22f1a14926ed21a8d\": container with ID starting with 721156f5376f735888d07c7703c2089e5ce02e21fb07e7f22f1a14926ed21a8d not found: ID does not exist" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.174526 4956 scope.go:117] "RemoveContainer" containerID="64a0aa04664f9708241b656876b97fc94ba586e6a3a188a6b8c2b3bda5559a4b" Mar 14 09:42:45 crc kubenswrapper[4956]: E0314 09:42:45.174788 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64a0aa04664f9708241b656876b97fc94ba586e6a3a188a6b8c2b3bda5559a4b\": container with ID starting with 64a0aa04664f9708241b656876b97fc94ba586e6a3a188a6b8c2b3bda5559a4b not found: ID does not exist" containerID="64a0aa04664f9708241b656876b97fc94ba586e6a3a188a6b8c2b3bda5559a4b" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.174810 
4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64a0aa04664f9708241b656876b97fc94ba586e6a3a188a6b8c2b3bda5559a4b"} err="failed to get container status \"64a0aa04664f9708241b656876b97fc94ba586e6a3a188a6b8c2b3bda5559a4b\": rpc error: code = NotFound desc = could not find container \"64a0aa04664f9708241b656876b97fc94ba586e6a3a188a6b8c2b3bda5559a4b\": container with ID starting with 64a0aa04664f9708241b656876b97fc94ba586e6a3a188a6b8c2b3bda5559a4b not found: ID does not exist" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.174824 4956 scope.go:117] "RemoveContainer" containerID="72f6ac381ecff25e58fc1391fdf97e24078a8b4f8e0ccb6ea2132328bca12054" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.182740 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "697b007c-6fbc-4031-ae2b-3ef786c7dab4" (UID: "697b007c-6fbc-4031-ae2b-3ef786c7dab4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.183382 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-kube-api-access-lj8w2" (OuterVolumeSpecName: "kube-api-access-lj8w2") pod "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" (UID: "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527"). InnerVolumeSpecName "kube-api-access-lj8w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.241443 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" (UID: "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262085 4956 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262109 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262118 4956 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-sys\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262130 4956 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-dev\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262139 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262147 4956 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262156 4956 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262166 4956 reconciler_common.go:293] "Volume 
detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262173 4956 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262181 4956 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262188 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262196 4956 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262204 4956 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262211 4956 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/697b007c-6fbc-4031-ae2b-3ef786c7dab4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262220 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9qsx\" (UniqueName: 
\"kubernetes.io/projected/697b007c-6fbc-4031-ae2b-3ef786c7dab4-kube-api-access-g9qsx\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262229 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj8w2\" (UniqueName: \"kubernetes.io/projected/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-kube-api-access-lj8w2\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262237 4956 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.262244 4956 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-run\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.286576 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "697b007c-6fbc-4031-ae2b-3ef786c7dab4" (UID: "697b007c-6fbc-4031-ae2b-3ef786c7dab4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.324057 4956 scope.go:117] "RemoveContainer" containerID="57c8cf4ea95bcd8fe72c096c9040db255c24e4f4cee5c17d87ade6f6471c0847" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.327750 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-config-data" (OuterVolumeSpecName: "config-data") pod "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" (UID: "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.345464 4956 scope.go:117] "RemoveContainer" containerID="72f6ac381ecff25e58fc1391fdf97e24078a8b4f8e0ccb6ea2132328bca12054" Mar 14 09:42:45 crc kubenswrapper[4956]: E0314 09:42:45.346327 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72f6ac381ecff25e58fc1391fdf97e24078a8b4f8e0ccb6ea2132328bca12054\": container with ID starting with 72f6ac381ecff25e58fc1391fdf97e24078a8b4f8e0ccb6ea2132328bca12054 not found: ID does not exist" containerID="72f6ac381ecff25e58fc1391fdf97e24078a8b4f8e0ccb6ea2132328bca12054" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.346376 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f6ac381ecff25e58fc1391fdf97e24078a8b4f8e0ccb6ea2132328bca12054"} err="failed to get container status \"72f6ac381ecff25e58fc1391fdf97e24078a8b4f8e0ccb6ea2132328bca12054\": rpc error: code = NotFound desc = could not find container \"72f6ac381ecff25e58fc1391fdf97e24078a8b4f8e0ccb6ea2132328bca12054\": container with ID starting with 72f6ac381ecff25e58fc1391fdf97e24078a8b4f8e0ccb6ea2132328bca12054 not found: ID does not exist" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.346399 4956 scope.go:117] "RemoveContainer" containerID="57c8cf4ea95bcd8fe72c096c9040db255c24e4f4cee5c17d87ade6f6471c0847" Mar 14 09:42:45 crc kubenswrapper[4956]: E0314 09:42:45.346668 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57c8cf4ea95bcd8fe72c096c9040db255c24e4f4cee5c17d87ade6f6471c0847\": container with ID starting with 57c8cf4ea95bcd8fe72c096c9040db255c24e4f4cee5c17d87ade6f6471c0847 not found: ID does not exist" containerID="57c8cf4ea95bcd8fe72c096c9040db255c24e4f4cee5c17d87ade6f6471c0847" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.346695 
4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57c8cf4ea95bcd8fe72c096c9040db255c24e4f4cee5c17d87ade6f6471c0847"} err="failed to get container status \"57c8cf4ea95bcd8fe72c096c9040db255c24e4f4cee5c17d87ade6f6471c0847\": rpc error: code = NotFound desc = could not find container \"57c8cf4ea95bcd8fe72c096c9040db255c24e4f4cee5c17d87ade6f6471c0847\": container with ID starting with 57c8cf4ea95bcd8fe72c096c9040db255c24e4f4cee5c17d87ade6f6471c0847 not found: ID does not exist" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.361987 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-config-data" (OuterVolumeSpecName: "config-data") pod "697b007c-6fbc-4031-ae2b-3ef786c7dab4" (UID: "697b007c-6fbc-4031-ae2b-3ef786c7dab4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.362325 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" (UID: "f9e454b6-b07d-4c7f-a40b-b35dc9ff9527"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.364557 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.364607 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.364620 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.364633 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.364638 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "697b007c-6fbc-4031-ae2b-3ef786c7dab4" (UID: "697b007c-6fbc-4031-ae2b-3ef786c7dab4"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.458107 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.463631 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.466338 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/697b007c-6fbc-4031-ae2b-3ef786c7dab4-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.473828 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.484687 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.491310 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Mar 14 09:42:45 crc kubenswrapper[4956]: E0314 09:42:45.491686 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" containerName="probe" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.491703 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" containerName="probe" Mar 14 09:42:45 crc kubenswrapper[4956]: E0314 09:42:45.491722 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" containerName="cinder-backup" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.491729 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" containerName="cinder-backup" Mar 14 09:42:45 crc kubenswrapper[4956]: E0314 09:42:45.491746 4956 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="697b007c-6fbc-4031-ae2b-3ef786c7dab4" containerName="probe" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.491752 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="697b007c-6fbc-4031-ae2b-3ef786c7dab4" containerName="probe" Mar 14 09:42:45 crc kubenswrapper[4956]: E0314 09:42:45.491765 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="697b007c-6fbc-4031-ae2b-3ef786c7dab4" containerName="cinder-scheduler" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.491771 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="697b007c-6fbc-4031-ae2b-3ef786c7dab4" containerName="cinder-scheduler" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.491935 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" containerName="probe" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.491960 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="697b007c-6fbc-4031-ae2b-3ef786c7dab4" containerName="probe" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.491973 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="697b007c-6fbc-4031-ae2b-3ef786c7dab4" containerName="cinder-scheduler" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.491980 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" containerName="cinder-backup" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.492856 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.496259 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-backup-config-data" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.499645 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.506497 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.507820 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.509313 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-scheduler-config-data" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.529135 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.644649 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.644993 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerName="ceilometer-central-agent" containerID="cri-o://168bb444d67e401128b5f0b814b66adaa787fde231fc18eddfae2e159454fefb" gracePeriod=30 Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.645061 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerName="ceilometer-notification-agent" containerID="cri-o://074554d7db856e2f5225e85249d9133cf1b722c215b515c276018f37ff8c96ba" 
gracePeriod=30 Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.645105 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerName="sg-core" containerID="cri-o://21564abf7068131a17259f9cd7f84410ce38d1d6c9be0976c17eeb22b5673b84" gracePeriod=30 Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.645061 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerName="proxy-httpd" containerID="cri-o://8ea674bdc5e5ead930de580fd1719fe8702d827caca10b02bd5991162db898d4" gracePeriod=30 Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.669523 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-dev\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.669590 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-run\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.669660 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsctn\" (UniqueName: \"kubernetes.io/projected/ee33bd5a-39fa-4066-8d18-06e1c12374b7-kube-api-access-fsctn\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.669730 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2954f359-0b87-4748-bd2f-f16d3d5121a4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.669766 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.669793 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-scripts\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.669848 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-config-data-custom\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.669899 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.669919 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.669941 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xc8t\" (UniqueName: \"kubernetes.io/projected/2954f359-0b87-4748-bd2f-f16d3d5121a4-kube-api-access-6xc8t\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.669958 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.670001 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-lib-modules\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.670019 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-scripts\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.670037 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.670067 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.670099 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.670114 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-nvme\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.670154 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.670171 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.670244 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.670278 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-config-data\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.670300 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-config-data\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.670357 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-sys\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.720209 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.772329 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.772386 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.772452 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.772470 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-config-data\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.772571 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-config-data\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " 
pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.772614 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-sys\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.772676 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-dev\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.772746 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-run\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.772780 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsctn\" (UniqueName: \"kubernetes.io/projected/ee33bd5a-39fa-4066-8d18-06e1c12374b7-kube-api-access-fsctn\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.772847 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2954f359-0b87-4748-bd2f-f16d3d5121a4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.772875 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.772926 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-scripts\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.772953 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-config-data-custom\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.773018 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.773050 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.773128 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xc8t\" (UniqueName: 
\"kubernetes.io/projected/2954f359-0b87-4748-bd2f-f16d3d5121a4-kube-api-access-6xc8t\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.773187 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.773224 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-lib-modules\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.773288 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-scripts\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.773319 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.773411 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-lib-cinder\") pod \"cinder-backup-0\" (UID: 
\"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.773576 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.773682 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-nvme\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.773949 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-nvme\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.774048 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.774117 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.774711 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-lib-modules\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.775207 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-run\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.775575 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-sys\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.775950 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-dev\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.776281 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.776336 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-lib-cinder\") pod \"cinder-backup-0\" (UID: 
\"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.776902 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2954f359-0b87-4748-bd2f-f16d3d5121a4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.777402 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.781353 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.781414 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.781872 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-scripts\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 
09:42:45.782005 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-config-data\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.782681 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.783461 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-scripts\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.783543 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-config-data-custom\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.783797 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.787264 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-config-data\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.790866 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.794225 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsctn\" (UniqueName: \"kubernetes.io/projected/ee33bd5a-39fa-4066-8d18-06e1c12374b7-kube-api-access-fsctn\") pod \"cinder-backup-0\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.797844 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xc8t\" (UniqueName: \"kubernetes.io/projected/2954f359-0b87-4748-bd2f-f16d3d5121a4-kube-api-access-6xc8t\") pod \"cinder-scheduler-0\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.808400 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:45 crc kubenswrapper[4956]: I0314 09:42:45.826470 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.138673 4956 generic.go:334] "Generic (PLEG): container finished" podID="8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51" containerID="64abe770ac3c8873baddb1131c39b9e1338f98b06403d389b78593e1c547a6be" exitCode=0 Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.139351 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51","Type":"ContainerDied","Data":"64abe770ac3c8873baddb1131c39b9e1338f98b06403d389b78593e1c547a6be"} Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.145255 4956 generic.go:334] "Generic (PLEG): container finished" podID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerID="8ea674bdc5e5ead930de580fd1719fe8702d827caca10b02bd5991162db898d4" exitCode=0 Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.145298 4956 generic.go:334] "Generic (PLEG): container finished" podID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerID="21564abf7068131a17259f9cd7f84410ce38d1d6c9be0976c17eeb22b5673b84" exitCode=2 Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.145313 4956 generic.go:334] "Generic (PLEG): container finished" podID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerID="168bb444d67e401128b5f0b814b66adaa787fde231fc18eddfae2e159454fefb" exitCode=0 Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.145923 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4ee019dc-4049-48be-b9a5-48bfd7d5087d","Type":"ContainerDied","Data":"8ea674bdc5e5ead930de580fd1719fe8702d827caca10b02bd5991162db898d4"} Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.145978 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"4ee019dc-4049-48be-b9a5-48bfd7d5087d","Type":"ContainerDied","Data":"21564abf7068131a17259f9cd7f84410ce38d1d6c9be0976c17eeb22b5673b84"} Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.145990 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4ee019dc-4049-48be-b9a5-48bfd7d5087d","Type":"ContainerDied","Data":"168bb444d67e401128b5f0b814b66adaa787fde231fc18eddfae2e159454fefb"} Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.279385 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.381111 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-config-data\") pod \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.381497 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcr9v\" (UniqueName: \"kubernetes.io/projected/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-kube-api-access-fcr9v\") pod \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.381662 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-combined-ca-bundle\") pod \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.381690 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-custom-prometheus-ca\") pod \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.381754 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-logs\") pod \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.381776 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-cert-memcached-mtls\") pod \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\" (UID: \"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51\") " Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.382890 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.385231 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-logs" (OuterVolumeSpecName: "logs") pod "8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51" (UID: "8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.394071 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.398363 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-kube-api-access-fcr9v" (OuterVolumeSpecName: "kube-api-access-fcr9v") pod "8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51" (UID: "8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51"). 
InnerVolumeSpecName "kube-api-access-fcr9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.412255 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51" (UID: "8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.416079 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51" (UID: "8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.449622 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-config-data" (OuterVolumeSpecName: "config-data") pod "8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51" (UID: "8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.483131 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.483164 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcr9v\" (UniqueName: \"kubernetes.io/projected/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-kube-api-access-fcr9v\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.483173 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.483182 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.483191 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.513188 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51" (UID: "8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.584979 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:46 crc kubenswrapper[4956]: I0314 09:42:46.924365 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/watcher-decision-engine/0.log" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.164701 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"ee33bd5a-39fa-4066-8d18-06e1c12374b7","Type":"ContainerStarted","Data":"06194ba955e4cd5d8b04f2a331150de087ca37a02423781ab72e8e623f62d420"} Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.165602 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"ee33bd5a-39fa-4066-8d18-06e1c12374b7","Type":"ContainerStarted","Data":"4fcd7e33cfbaeb3912d96786bb02dd5d456cafa757c62fe556f3e211c0f5a827"} Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.171444 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51","Type":"ContainerDied","Data":"d9ecd5124c6effd229b32e6a64b2aeef13b96d62efd829358eabc0eed444a921"} Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.171527 4956 scope.go:117] "RemoveContainer" containerID="64abe770ac3c8873baddb1131c39b9e1338f98b06403d389b78593e1c547a6be" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.171722 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.184856 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"2954f359-0b87-4748-bd2f-f16d3d5121a4","Type":"ContainerStarted","Data":"74cb53279a160f340c28c86f6e609e1f7e379c7144c75648d336054ce41691fa"} Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.253668 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="697b007c-6fbc-4031-ae2b-3ef786c7dab4" path="/var/lib/kubelet/pods/697b007c-6fbc-4031-ae2b-3ef786c7dab4/volumes" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.257925 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e454b6-b07d-4c7f-a40b-b35dc9ff9527" path="/var/lib/kubelet/pods/f9e454b6-b07d-4c7f-a40b-b35dc9ff9527/volumes" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.291599 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.329826 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.338629 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:42:47 crc kubenswrapper[4956]: E0314 09:42:47.339113 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51" containerName="watcher-decision-engine" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.339142 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51" containerName="watcher-decision-engine" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.339368 4956 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51" containerName="watcher-decision-engine" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.340116 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.345090 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.350952 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.400930 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfc67856-2753-47d2-8863-dec939316fd2-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.400991 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.401047 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.401084 
4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.401133 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcbxn\" (UniqueName: \"kubernetes.io/projected/dfc67856-2753-47d2-8863-dec939316fd2-kube-api-access-vcbxn\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.401158 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.503796 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.504904 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.505129 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcbxn\" (UniqueName: \"kubernetes.io/projected/dfc67856-2753-47d2-8863-dec939316fd2-kube-api-access-vcbxn\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.505251 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.505411 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfc67856-2753-47d2-8863-dec939316fd2-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.505568 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.507030 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfc67856-2753-47d2-8863-dec939316fd2-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.510463 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.512211 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.512830 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.525668 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.532027 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcbxn\" (UniqueName: \"kubernetes.io/projected/dfc67856-2753-47d2-8863-dec939316fd2-kube-api-access-vcbxn\") pod \"watcher-kuttl-decision-engine-0\" (UID: 
\"dfc67856-2753-47d2-8863-dec939316fd2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:47 crc kubenswrapper[4956]: I0314 09:42:47.667501 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:48 crc kubenswrapper[4956]: I0314 09:42:48.183454 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:42:48 crc kubenswrapper[4956]: I0314 09:42:48.208553 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"2954f359-0b87-4748-bd2f-f16d3d5121a4","Type":"ContainerStarted","Data":"3e9b5d27431a022ed94bd7f09a7ae594a2dc4ac0ec6621e81c9263743e575977"} Mar 14 09:42:48 crc kubenswrapper[4956]: I0314 09:42:48.208845 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"2954f359-0b87-4748-bd2f-f16d3d5121a4","Type":"ContainerStarted","Data":"6babc6ae007119e64b96931c96439c5f0923ad12a5892ce6495b37108b53fbd7"} Mar 14 09:42:48 crc kubenswrapper[4956]: I0314 09:42:48.216586 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"ee33bd5a-39fa-4066-8d18-06e1c12374b7","Type":"ContainerStarted","Data":"ab1b972777088af64b581d52e000dcb4a97aa4f5768715cbfaaa39ba7fcb25b4"} Mar 14 09:42:48 crc kubenswrapper[4956]: I0314 09:42:48.258360 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-backup-0" podStartSLOduration=3.258339284 podStartE2EDuration="3.258339284s" podCreationTimestamp="2026-03-14 09:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:42:48.256025745 +0000 UTC m=+2773.768718013" watchObservedRunningTime="2026-03-14 09:42:48.258339284 +0000 UTC m=+2773.771031552" 
Mar 14 09:42:48 crc kubenswrapper[4956]: I0314 09:42:48.260445 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-scheduler-0" podStartSLOduration=3.260435667 podStartE2EDuration="3.260435667s" podCreationTimestamp="2026-03-14 09:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:42:48.227693878 +0000 UTC m=+2773.740386136" watchObservedRunningTime="2026-03-14 09:42:48.260435667 +0000 UTC m=+2773.773127935" Mar 14 09:42:49 crc kubenswrapper[4956]: I0314 09:42:49.253624 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51" path="/var/lib/kubelet/pods/8e3142ae-665e-40cc-9ee7-9c2ceb4f1e51/volumes" Mar 14 09:42:49 crc kubenswrapper[4956]: I0314 09:42:49.262185 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"dfc67856-2753-47d2-8863-dec939316fd2","Type":"ContainerStarted","Data":"2138362324a289a61f1abdb80a88d86ec50863ba3f271b4ed1dfdd791ac8b296"} Mar 14 09:42:49 crc kubenswrapper[4956]: I0314 09:42:49.262448 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"dfc67856-2753-47d2-8863-dec939316fd2","Type":"ContainerStarted","Data":"0a4be1389802e084cea09e5ca6d7dcd1024dc4312c3808befadf0614b8b485e5"} Mar 14 09:42:49 crc kubenswrapper[4956]: I0314 09:42:49.305396 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.305379535 podStartE2EDuration="2.305379535s" podCreationTimestamp="2026-03-14 09:42:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:42:49.303036866 +0000 UTC m=+2774.815729134" 
watchObservedRunningTime="2026-03-14 09:42:49.305379535 +0000 UTC m=+2774.818071803" Mar 14 09:42:49 crc kubenswrapper[4956]: I0314 09:42:49.350693 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_dfc67856-2753-47d2-8863-dec939316fd2/watcher-decision-engine/0.log" Mar 14 09:42:49 crc kubenswrapper[4956]: I0314 09:42:49.649723 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:42:50 crc kubenswrapper[4956]: I0314 09:42:50.547450 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_dfc67856-2753-47d2-8863-dec939316fd2/watcher-decision-engine/0.log" Mar 14 09:42:50 crc kubenswrapper[4956]: I0314 09:42:50.809206 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:50 crc kubenswrapper[4956]: I0314 09:42:50.827199 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:51 crc kubenswrapper[4956]: I0314 09:42:51.283848 4956 generic.go:334] "Generic (PLEG): container finished" podID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerID="074554d7db856e2f5225e85249d9133cf1b722c215b515c276018f37ff8c96ba" exitCode=0 Mar 14 09:42:51 crc kubenswrapper[4956]: I0314 09:42:51.283936 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4ee019dc-4049-48be-b9a5-48bfd7d5087d","Type":"ContainerDied","Data":"074554d7db856e2f5225e85249d9133cf1b722c215b515c276018f37ff8c96ba"} Mar 14 09:42:51 crc kubenswrapper[4956]: I0314 09:42:51.752407 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_dfc67856-2753-47d2-8863-dec939316fd2/watcher-decision-engine/0.log" Mar 14 09:42:51 crc kubenswrapper[4956]: I0314 
09:42:51.855345 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.006854 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-ceilometer-tls-certs\") pod \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.006963 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ee019dc-4049-48be-b9a5-48bfd7d5087d-run-httpd\") pod \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.007014 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-combined-ca-bundle\") pod \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.007070 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-config-data\") pod \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.007103 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ee019dc-4049-48be-b9a5-48bfd7d5087d-log-httpd\") pod \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.007186 4956 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-5qbxf\" (UniqueName: \"kubernetes.io/projected/4ee019dc-4049-48be-b9a5-48bfd7d5087d-kube-api-access-5qbxf\") pod \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.007264 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-scripts\") pod \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.007339 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-sg-core-conf-yaml\") pod \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\" (UID: \"4ee019dc-4049-48be-b9a5-48bfd7d5087d\") " Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.007548 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ee019dc-4049-48be-b9a5-48bfd7d5087d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4ee019dc-4049-48be-b9a5-48bfd7d5087d" (UID: "4ee019dc-4049-48be-b9a5-48bfd7d5087d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.007881 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ee019dc-4049-48be-b9a5-48bfd7d5087d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4ee019dc-4049-48be-b9a5-48bfd7d5087d" (UID: "4ee019dc-4049-48be-b9a5-48bfd7d5087d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.008222 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ee019dc-4049-48be-b9a5-48bfd7d5087d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.008237 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ee019dc-4049-48be-b9a5-48bfd7d5087d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.024322 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-scripts" (OuterVolumeSpecName: "scripts") pod "4ee019dc-4049-48be-b9a5-48bfd7d5087d" (UID: "4ee019dc-4049-48be-b9a5-48bfd7d5087d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.024462 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ee019dc-4049-48be-b9a5-48bfd7d5087d-kube-api-access-5qbxf" (OuterVolumeSpecName: "kube-api-access-5qbxf") pod "4ee019dc-4049-48be-b9a5-48bfd7d5087d" (UID: "4ee019dc-4049-48be-b9a5-48bfd7d5087d"). InnerVolumeSpecName "kube-api-access-5qbxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.049557 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4ee019dc-4049-48be-b9a5-48bfd7d5087d" (UID: "4ee019dc-4049-48be-b9a5-48bfd7d5087d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.081157 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4ee019dc-4049-48be-b9a5-48bfd7d5087d" (UID: "4ee019dc-4049-48be-b9a5-48bfd7d5087d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.103876 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-config-data" (OuterVolumeSpecName: "config-data") pod "4ee019dc-4049-48be-b9a5-48bfd7d5087d" (UID: "4ee019dc-4049-48be-b9a5-48bfd7d5087d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.109879 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.109940 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.109957 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.109972 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qbxf\" (UniqueName: \"kubernetes.io/projected/4ee019dc-4049-48be-b9a5-48bfd7d5087d-kube-api-access-5qbxf\") on node 
\"crc\" DevicePath \"\"" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.109985 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.123048 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ee019dc-4049-48be-b9a5-48bfd7d5087d" (UID: "4ee019dc-4049-48be-b9a5-48bfd7d5087d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.211276 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee019dc-4049-48be-b9a5-48bfd7d5087d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.295131 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4ee019dc-4049-48be-b9a5-48bfd7d5087d","Type":"ContainerDied","Data":"3f0d8ddd2a3a9f81bd3adafcb5415de5181db2147ba6d28cd8483b14cb91d0a1"} Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.295193 4956 scope.go:117] "RemoveContainer" containerID="8ea674bdc5e5ead930de580fd1719fe8702d827caca10b02bd5991162db898d4" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.295441 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.318415 4956 scope.go:117] "RemoveContainer" containerID="21564abf7068131a17259f9cd7f84410ce38d1d6c9be0976c17eeb22b5673b84" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.337097 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.344807 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.347948 4956 scope.go:117] "RemoveContainer" containerID="074554d7db856e2f5225e85249d9133cf1b722c215b515c276018f37ff8c96ba" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.360124 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:42:52 crc kubenswrapper[4956]: E0314 09:42:52.360457 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerName="ceilometer-central-agent" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.360472 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerName="ceilometer-central-agent" Mar 14 09:42:52 crc kubenswrapper[4956]: E0314 09:42:52.360503 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerName="proxy-httpd" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.360513 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerName="proxy-httpd" Mar 14 09:42:52 crc kubenswrapper[4956]: E0314 09:42:52.360535 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerName="ceilometer-notification-agent" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.360542 4956 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerName="ceilometer-notification-agent" Mar 14 09:42:52 crc kubenswrapper[4956]: E0314 09:42:52.360568 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerName="sg-core" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.360576 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerName="sg-core" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.360751 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerName="ceilometer-central-agent" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.360767 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerName="ceilometer-notification-agent" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.360790 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerName="proxy-httpd" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.360814 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" containerName="sg-core" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.378107 4956 scope.go:117] "RemoveContainer" containerID="168bb444d67e401128b5f0b814b66adaa787fde231fc18eddfae2e159454fefb" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.384116 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.384296 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.387527 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.387793 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.389727 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.518219 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-run-httpd\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.518296 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.518314 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-scripts\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.518329 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.518353 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-log-httpd\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.518446 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-config-data\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.518466 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5sgf\" (UniqueName: \"kubernetes.io/projected/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-kube-api-access-c5sgf\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.518506 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.620205 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.620538 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.620638 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-scripts\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.620746 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-log-httpd\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.620978 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-config-data\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.621096 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5sgf\" (UniqueName: \"kubernetes.io/projected/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-kube-api-access-c5sgf\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " 
pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.621536 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.622001 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-run-httpd\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.621270 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-log-httpd\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.622455 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-run-httpd\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.628235 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-scripts\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.628233 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.629590 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.630469 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-config-data\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.637730 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.641579 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5sgf\" (UniqueName: \"kubernetes.io/projected/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-kube-api-access-c5sgf\") pod \"ceilometer-0\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.716465 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:52 crc kubenswrapper[4956]: I0314 09:42:52.982942 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_dfc67856-2753-47d2-8863-dec939316fd2/watcher-decision-engine/0.log" Mar 14 09:42:53 crc kubenswrapper[4956]: I0314 09:42:53.160539 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:42:53 crc kubenswrapper[4956]: W0314 09:42:53.167955 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod281db90f_5ce5_44ed_a0dd_a83d23bc5b50.slice/crio-cdb1eb88aa9d897dde185c3233240e5c6ca4780edc5ab9c9815c995ebe8e8598 WatchSource:0}: Error finding container cdb1eb88aa9d897dde185c3233240e5c6ca4780edc5ab9c9815c995ebe8e8598: Status 404 returned error can't find the container with id cdb1eb88aa9d897dde185c3233240e5c6ca4780edc5ab9c9815c995ebe8e8598 Mar 14 09:42:53 crc kubenswrapper[4956]: I0314 09:42:53.219975 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ee019dc-4049-48be-b9a5-48bfd7d5087d" path="/var/lib/kubelet/pods/4ee019dc-4049-48be-b9a5-48bfd7d5087d/volumes" Mar 14 09:42:53 crc kubenswrapper[4956]: I0314 09:42:53.306365 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"281db90f-5ce5-44ed-a0dd-a83d23bc5b50","Type":"ContainerStarted","Data":"cdb1eb88aa9d897dde185c3233240e5c6ca4780edc5ab9c9815c995ebe8e8598"} Mar 14 09:42:54 crc kubenswrapper[4956]: I0314 09:42:54.160259 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_dfc67856-2753-47d2-8863-dec939316fd2/watcher-decision-engine/0.log" Mar 14 09:42:54 crc kubenswrapper[4956]: I0314 09:42:54.317376 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"281db90f-5ce5-44ed-a0dd-a83d23bc5b50","Type":"ContainerStarted","Data":"62cf6fdd55006c5e954e55b390c9d3d6a5e8a8d5fd2508b523003953fe4dfeee"} Mar 14 09:42:55 crc kubenswrapper[4956]: I0314 09:42:55.326821 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"281db90f-5ce5-44ed-a0dd-a83d23bc5b50","Type":"ContainerStarted","Data":"adb68344a7eb7c5349ee595266f9285bcaf506b06bae3d6e825ff606af62da3a"} Mar 14 09:42:55 crc kubenswrapper[4956]: I0314 09:42:55.400799 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_dfc67856-2753-47d2-8863-dec939316fd2/watcher-decision-engine/0.log" Mar 14 09:42:55 crc kubenswrapper[4956]: I0314 09:42:55.423704 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:42:55 crc kubenswrapper[4956]: I0314 09:42:55.423750 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:42:55 crc kubenswrapper[4956]: I0314 09:42:55.423790 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 09:42:55 crc kubenswrapper[4956]: I0314 09:42:55.424394 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"782b831bfebf5d160529d876878bf15ad46c206265675567c1df2cedcbdb4339"} 
pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:42:55 crc kubenswrapper[4956]: I0314 09:42:55.424453 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" containerID="cri-o://782b831bfebf5d160529d876878bf15ad46c206265675567c1df2cedcbdb4339" gracePeriod=600 Mar 14 09:42:56 crc kubenswrapper[4956]: I0314 09:42:56.025994 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:42:56 crc kubenswrapper[4956]: I0314 09:42:56.055951 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:42:56 crc kubenswrapper[4956]: I0314 09:42:56.337877 4956 generic.go:334] "Generic (PLEG): container finished" podID="9ba20367-e506-422e-a846-eb1525cb3b94" containerID="782b831bfebf5d160529d876878bf15ad46c206265675567c1df2cedcbdb4339" exitCode=0 Mar 14 09:42:56 crc kubenswrapper[4956]: I0314 09:42:56.337957 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerDied","Data":"782b831bfebf5d160529d876878bf15ad46c206265675567c1df2cedcbdb4339"} Mar 14 09:42:56 crc kubenswrapper[4956]: I0314 09:42:56.338014 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerStarted","Data":"41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634"} Mar 14 09:42:56 crc kubenswrapper[4956]: I0314 09:42:56.338039 4956 scope.go:117] "RemoveContainer" 
containerID="b0757eb421058859776ec55eb583de3c675650c58294e73808f9e5ce1ae0c37a" Mar 14 09:42:56 crc kubenswrapper[4956]: I0314 09:42:56.340729 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"281db90f-5ce5-44ed-a0dd-a83d23bc5b50","Type":"ContainerStarted","Data":"23622070ab174a2e93feac67b231e79553d7fa509e9e789984f9cb23be6b9b40"} Mar 14 09:42:56 crc kubenswrapper[4956]: I0314 09:42:56.951434 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_dfc67856-2753-47d2-8863-dec939316fd2/watcher-decision-engine/0.log" Mar 14 09:42:57 crc kubenswrapper[4956]: I0314 09:42:57.153054 4956 scope.go:117] "RemoveContainer" containerID="defe8aa91f6ee237c406e15df839c5268b357638d3922dc07b09178bffd879ae" Mar 14 09:42:57 crc kubenswrapper[4956]: I0314 09:42:57.187217 4956 scope.go:117] "RemoveContainer" containerID="97ba76b7f590e48964927a1c93ac0cc7d2147eeca7cd5c6a69dc38cdffe3b900" Mar 14 09:42:57 crc kubenswrapper[4956]: I0314 09:42:57.236695 4956 scope.go:117] "RemoveContainer" containerID="3fdf7b4e03c7e389dae3f007121417b911ea4f71fdad56e9fe34e31bbfd30e02" Mar 14 09:42:57 crc kubenswrapper[4956]: I0314 09:42:57.252533 4956 scope.go:117] "RemoveContainer" containerID="6dc77a797fb23dd64c5e353052ea68ad0723873ef34ecff129286644d01910db" Mar 14 09:42:57 crc kubenswrapper[4956]: I0314 09:42:57.303600 4956 scope.go:117] "RemoveContainer" containerID="4a31ef57030e278350ebab884ff409db661aab9cd8d25e426395902db606ac73" Mar 14 09:42:57 crc kubenswrapper[4956]: I0314 09:42:57.330297 4956 scope.go:117] "RemoveContainer" containerID="531ab46c4edb095fea3a136af494feba3e4ea88c168d6565d7a3e64600d61f72" Mar 14 09:42:57 crc kubenswrapper[4956]: I0314 09:42:57.388371 4956 scope.go:117] "RemoveContainer" containerID="ba8dda9e3ebef2cfdcecc1d852fe29f813806b422aeff407f88a4df3837d64e6" Mar 14 09:42:57 crc kubenswrapper[4956]: I0314 09:42:57.668223 4956 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:57 crc kubenswrapper[4956]: I0314 09:42:57.696429 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:58 crc kubenswrapper[4956]: I0314 09:42:58.157827 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_dfc67856-2753-47d2-8863-dec939316fd2/watcher-decision-engine/0.log" Mar 14 09:42:58 crc kubenswrapper[4956]: I0314 09:42:58.375383 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"281db90f-5ce5-44ed-a0dd-a83d23bc5b50","Type":"ContainerStarted","Data":"478833326be177f61cb55a847227c95e9d3e3214eca78b1e442ebc3fd3677740"} Mar 14 09:42:58 crc kubenswrapper[4956]: I0314 09:42:58.375859 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:58 crc kubenswrapper[4956]: I0314 09:42:58.398277 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:42:58 crc kubenswrapper[4956]: I0314 09:42:58.404886 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.829342619 podStartE2EDuration="6.40486111s" podCreationTimestamp="2026-03-14 09:42:52 +0000 UTC" firstStartedPulling="2026-03-14 09:42:53.170639277 +0000 UTC m=+2778.683331545" lastFinishedPulling="2026-03-14 09:42:57.746157768 +0000 UTC m=+2783.258850036" observedRunningTime="2026-03-14 09:42:58.397905574 +0000 UTC m=+2783.910597862" watchObservedRunningTime="2026-03-14 09:42:58.40486111 +0000 UTC m=+2783.917553378" Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.347009 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_dfc67856-2753-47d2-8863-dec939316fd2/watcher-decision-engine/0.log" Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.384799 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.549382 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_dfc67856-2753-47d2-8863-dec939316fd2/watcher-decision-engine/0.log" Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.676703 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-db-sync-l5tm2"] Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.682982 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-db-sync-l5tm2"] Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.789922 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.790318 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-backup-0" podUID="ee33bd5a-39fa-4066-8d18-06e1c12374b7" containerName="cinder-backup" containerID="cri-o://06194ba955e4cd5d8b04f2a331150de087ca37a02423781ab72e8e623f62d420" gracePeriod=30 Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.790905 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-backup-0" podUID="ee33bd5a-39fa-4066-8d18-06e1c12374b7" containerName="probe" containerID="cri-o://ab1b972777088af64b581d52e000dcb4a97aa4f5768715cbfaaa39ba7fcb25b4" gracePeriod=30 Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.839750 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 
09:42:59.840035 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-scheduler-0" podUID="2954f359-0b87-4748-bd2f-f16d3d5121a4" containerName="cinder-scheduler" containerID="cri-o://3e9b5d27431a022ed94bd7f09a7ae594a2dc4ac0ec6621e81c9263743e575977" gracePeriod=30 Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.840168 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-scheduler-0" podUID="2954f359-0b87-4748-bd2f-f16d3d5121a4" containerName="probe" containerID="cri-o://6babc6ae007119e64b96931c96439c5f0923ad12a5892ce6495b37108b53fbd7" gracePeriod=30 Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.866159 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.866583 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-api-0" podUID="be8317c5-1327-4216-bf3f-400253bdfa3c" containerName="cinder-api-log" containerID="cri-o://8a8784e10b8f2c58186aede58c8d97b8877e2b9b542f46aab35dbf186364889a" gracePeriod=30 Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.866651 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-api-0" podUID="be8317c5-1327-4216-bf3f-400253bdfa3c" containerName="cinder-api" containerID="cri-o://569c9fa74d528637f1f5a56e76415682e4de7000b1e42d582e04fd714e5dea15" gracePeriod=30 Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.884797 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder46a3-account-delete-4rngm"] Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.888613 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder46a3-account-delete-4rngm" Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.899923 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder46a3-account-delete-4rngm"] Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.952454 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b5002d8-2ce5-49ae-bb50-e3d61d98ee47-operator-scripts\") pod \"cinder46a3-account-delete-4rngm\" (UID: \"8b5002d8-2ce5-49ae-bb50-e3d61d98ee47\") " pod="watcher-kuttl-default/cinder46a3-account-delete-4rngm" Mar 14 09:42:59 crc kubenswrapper[4956]: I0314 09:42:59.952569 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sdjj\" (UniqueName: \"kubernetes.io/projected/8b5002d8-2ce5-49ae-bb50-e3d61d98ee47-kube-api-access-9sdjj\") pod \"cinder46a3-account-delete-4rngm\" (UID: \"8b5002d8-2ce5-49ae-bb50-e3d61d98ee47\") " pod="watcher-kuttl-default/cinder46a3-account-delete-4rngm" Mar 14 09:43:00 crc kubenswrapper[4956]: I0314 09:43:00.054715 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b5002d8-2ce5-49ae-bb50-e3d61d98ee47-operator-scripts\") pod \"cinder46a3-account-delete-4rngm\" (UID: \"8b5002d8-2ce5-49ae-bb50-e3d61d98ee47\") " pod="watcher-kuttl-default/cinder46a3-account-delete-4rngm" Mar 14 09:43:00 crc kubenswrapper[4956]: I0314 09:43:00.055084 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sdjj\" (UniqueName: \"kubernetes.io/projected/8b5002d8-2ce5-49ae-bb50-e3d61d98ee47-kube-api-access-9sdjj\") pod \"cinder46a3-account-delete-4rngm\" (UID: \"8b5002d8-2ce5-49ae-bb50-e3d61d98ee47\") " pod="watcher-kuttl-default/cinder46a3-account-delete-4rngm" Mar 14 09:43:00 crc 
kubenswrapper[4956]: I0314 09:43:00.056293 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b5002d8-2ce5-49ae-bb50-e3d61d98ee47-operator-scripts\") pod \"cinder46a3-account-delete-4rngm\" (UID: \"8b5002d8-2ce5-49ae-bb50-e3d61d98ee47\") " pod="watcher-kuttl-default/cinder46a3-account-delete-4rngm" Mar 14 09:43:00 crc kubenswrapper[4956]: I0314 09:43:00.079185 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sdjj\" (UniqueName: \"kubernetes.io/projected/8b5002d8-2ce5-49ae-bb50-e3d61d98ee47-kube-api-access-9sdjj\") pod \"cinder46a3-account-delete-4rngm\" (UID: \"8b5002d8-2ce5-49ae-bb50-e3d61d98ee47\") " pod="watcher-kuttl-default/cinder46a3-account-delete-4rngm" Mar 14 09:43:00 crc kubenswrapper[4956]: I0314 09:43:00.211915 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder46a3-account-delete-4rngm" Mar 14 09:43:00 crc kubenswrapper[4956]: I0314 09:43:00.420195 4956 generic.go:334] "Generic (PLEG): container finished" podID="be8317c5-1327-4216-bf3f-400253bdfa3c" containerID="8a8784e10b8f2c58186aede58c8d97b8877e2b9b542f46aab35dbf186364889a" exitCode=143 Mar 14 09:43:00 crc kubenswrapper[4956]: I0314 09:43:00.420547 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"be8317c5-1327-4216-bf3f-400253bdfa3c","Type":"ContainerDied","Data":"8a8784e10b8f2c58186aede58c8d97b8877e2b9b542f46aab35dbf186364889a"} Mar 14 09:43:00 crc kubenswrapper[4956]: I0314 09:43:00.423724 4956 generic.go:334] "Generic (PLEG): container finished" podID="ee33bd5a-39fa-4066-8d18-06e1c12374b7" containerID="ab1b972777088af64b581d52e000dcb4a97aa4f5768715cbfaaa39ba7fcb25b4" exitCode=0 Mar 14 09:43:00 crc kubenswrapper[4956]: I0314 09:43:00.424591 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" 
event={"ID":"ee33bd5a-39fa-4066-8d18-06e1c12374b7","Type":"ContainerDied","Data":"ab1b972777088af64b581d52e000dcb4a97aa4f5768715cbfaaa39ba7fcb25b4"} Mar 14 09:43:00 crc kubenswrapper[4956]: I0314 09:43:00.699573 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_dfc67856-2753-47d2-8863-dec939316fd2/watcher-decision-engine/0.log" Mar 14 09:43:00 crc kubenswrapper[4956]: I0314 09:43:00.704111 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder46a3-account-delete-4rngm"] Mar 14 09:43:01 crc kubenswrapper[4956]: I0314 09:43:01.218251 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72dd9b23-3681-4a76-bc25-a2271e65d0aa" path="/var/lib/kubelet/pods/72dd9b23-3681-4a76-bc25-a2271e65d0aa/volumes" Mar 14 09:43:01 crc kubenswrapper[4956]: I0314 09:43:01.446371 4956 generic.go:334] "Generic (PLEG): container finished" podID="8b5002d8-2ce5-49ae-bb50-e3d61d98ee47" containerID="e70d565c754401c3d75812dd5dd7621be42ac1eed5a8cf608b9bb035994147b9" exitCode=0 Mar 14 09:43:01 crc kubenswrapper[4956]: I0314 09:43:01.446447 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder46a3-account-delete-4rngm" event={"ID":"8b5002d8-2ce5-49ae-bb50-e3d61d98ee47","Type":"ContainerDied","Data":"e70d565c754401c3d75812dd5dd7621be42ac1eed5a8cf608b9bb035994147b9"} Mar 14 09:43:01 crc kubenswrapper[4956]: I0314 09:43:01.446473 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder46a3-account-delete-4rngm" event={"ID":"8b5002d8-2ce5-49ae-bb50-e3d61d98ee47","Type":"ContainerStarted","Data":"62c271ec6194d78dc2f9b2618793cf08057e803128212dd4dab033543e9a3fe4"} Mar 14 09:43:01 crc kubenswrapper[4956]: I0314 09:43:01.449517 4956 generic.go:334] "Generic (PLEG): container finished" podID="2954f359-0b87-4748-bd2f-f16d3d5121a4" containerID="6babc6ae007119e64b96931c96439c5f0923ad12a5892ce6495b37108b53fbd7" exitCode=0 Mar 
14 09:43:01 crc kubenswrapper[4956]: I0314 09:43:01.449567 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"2954f359-0b87-4748-bd2f-f16d3d5121a4","Type":"ContainerDied","Data":"6babc6ae007119e64b96931c96439c5f0923ad12a5892ce6495b37108b53fbd7"} Mar 14 09:43:01 crc kubenswrapper[4956]: I0314 09:43:01.495073 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:43:01 crc kubenswrapper[4956]: I0314 09:43:01.495294 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="dfc67856-2753-47d2-8863-dec939316fd2" containerName="watcher-decision-engine" containerID="cri-o://2138362324a289a61f1abdb80a88d86ec50863ba3f271b4ed1dfdd791ac8b296" gracePeriod=30 Mar 14 09:43:01 crc kubenswrapper[4956]: I0314 09:43:01.845233 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_dfc67856-2753-47d2-8863-dec939316fd2/watcher-decision-engine/0.log" Mar 14 09:43:02 crc kubenswrapper[4956]: I0314 09:43:02.358206 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:43:02 crc kubenswrapper[4956]: I0314 09:43:02.358562 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerName="ceilometer-central-agent" containerID="cri-o://62cf6fdd55006c5e954e55b390c9d3d6a5e8a8d5fd2508b523003953fe4dfeee" gracePeriod=30 Mar 14 09:43:02 crc kubenswrapper[4956]: I0314 09:43:02.358714 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerName="proxy-httpd" containerID="cri-o://478833326be177f61cb55a847227c95e9d3e3214eca78b1e442ebc3fd3677740" 
gracePeriod=30 Mar 14 09:43:02 crc kubenswrapper[4956]: I0314 09:43:02.358775 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerName="sg-core" containerID="cri-o://23622070ab174a2e93feac67b231e79553d7fa509e9e789984f9cb23be6b9b40" gracePeriod=30 Mar 14 09:43:02 crc kubenswrapper[4956]: I0314 09:43:02.358823 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerName="ceilometer-notification-agent" containerID="cri-o://adb68344a7eb7c5349ee595266f9285bcaf506b06bae3d6e825ff606af62da3a" gracePeriod=30 Mar 14 09:43:02 crc kubenswrapper[4956]: E0314 09:43:02.521925 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod281db90f_5ce5_44ed_a0dd_a83d23bc5b50.slice/crio-conmon-23622070ab174a2e93feac67b231e79553d7fa509e9e789984f9cb23be6b9b40.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod281db90f_5ce5_44ed_a0dd_a83d23bc5b50.slice/crio-23622070ab174a2e93feac67b231e79553d7fa509e9e789984f9cb23be6b9b40.scope\": RecentStats: unable to find data in memory cache]" Mar 14 09:43:02 crc kubenswrapper[4956]: I0314 09:43:02.828535 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder46a3-account-delete-4rngm" Mar 14 09:43:02 crc kubenswrapper[4956]: I0314 09:43:02.903234 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sdjj\" (UniqueName: \"kubernetes.io/projected/8b5002d8-2ce5-49ae-bb50-e3d61d98ee47-kube-api-access-9sdjj\") pod \"8b5002d8-2ce5-49ae-bb50-e3d61d98ee47\" (UID: \"8b5002d8-2ce5-49ae-bb50-e3d61d98ee47\") " Mar 14 09:43:02 crc kubenswrapper[4956]: I0314 09:43:02.903388 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b5002d8-2ce5-49ae-bb50-e3d61d98ee47-operator-scripts\") pod \"8b5002d8-2ce5-49ae-bb50-e3d61d98ee47\" (UID: \"8b5002d8-2ce5-49ae-bb50-e3d61d98ee47\") " Mar 14 09:43:02 crc kubenswrapper[4956]: I0314 09:43:02.904549 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b5002d8-2ce5-49ae-bb50-e3d61d98ee47-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b5002d8-2ce5-49ae-bb50-e3d61d98ee47" (UID: "8b5002d8-2ce5-49ae-bb50-e3d61d98ee47"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:43:02 crc kubenswrapper[4956]: I0314 09:43:02.911524 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b5002d8-2ce5-49ae-bb50-e3d61d98ee47-kube-api-access-9sdjj" (OuterVolumeSpecName: "kube-api-access-9sdjj") pod "8b5002d8-2ce5-49ae-bb50-e3d61d98ee47" (UID: "8b5002d8-2ce5-49ae-bb50-e3d61d98ee47"). InnerVolumeSpecName "kube-api-access-9sdjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.004826 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b5002d8-2ce5-49ae-bb50-e3d61d98ee47-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.004859 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sdjj\" (UniqueName: \"kubernetes.io/projected/8b5002d8-2ce5-49ae-bb50-e3d61d98ee47-kube-api-access-9sdjj\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.046510 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_dfc67856-2753-47d2-8863-dec939316fd2/watcher-decision-engine/0.log" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.293437 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/cinder-api-0" podUID="be8317c5-1327-4216-bf3f-400253bdfa3c" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.250:8776/healthcheck\": read tcp 10.217.0.2:40810->10.217.0.250:8776: read: connection reset by peer" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.465612 4956 generic.go:334] "Generic (PLEG): container finished" podID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerID="478833326be177f61cb55a847227c95e9d3e3214eca78b1e442ebc3fd3677740" exitCode=0 Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.465651 4956 generic.go:334] "Generic (PLEG): container finished" podID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerID="23622070ab174a2e93feac67b231e79553d7fa509e9e789984f9cb23be6b9b40" exitCode=2 Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.465669 4956 generic.go:334] "Generic (PLEG): container finished" podID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" 
containerID="adb68344a7eb7c5349ee595266f9285bcaf506b06bae3d6e825ff606af62da3a" exitCode=0 Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.465642 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"281db90f-5ce5-44ed-a0dd-a83d23bc5b50","Type":"ContainerDied","Data":"478833326be177f61cb55a847227c95e9d3e3214eca78b1e442ebc3fd3677740"} Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.465741 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"281db90f-5ce5-44ed-a0dd-a83d23bc5b50","Type":"ContainerDied","Data":"23622070ab174a2e93feac67b231e79553d7fa509e9e789984f9cb23be6b9b40"} Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.465755 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"281db90f-5ce5-44ed-a0dd-a83d23bc5b50","Type":"ContainerDied","Data":"adb68344a7eb7c5349ee595266f9285bcaf506b06bae3d6e825ff606af62da3a"} Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.467217 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder46a3-account-delete-4rngm" event={"ID":"8b5002d8-2ce5-49ae-bb50-e3d61d98ee47","Type":"ContainerDied","Data":"62c271ec6194d78dc2f9b2618793cf08057e803128212dd4dab033543e9a3fe4"} Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.467250 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62c271ec6194d78dc2f9b2618793cf08057e803128212dd4dab033543e9a3fe4" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.467284 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder46a3-account-delete-4rngm" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.469596 4956 generic.go:334] "Generic (PLEG): container finished" podID="be8317c5-1327-4216-bf3f-400253bdfa3c" containerID="569c9fa74d528637f1f5a56e76415682e4de7000b1e42d582e04fd714e5dea15" exitCode=0 Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.469626 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"be8317c5-1327-4216-bf3f-400253bdfa3c","Type":"ContainerDied","Data":"569c9fa74d528637f1f5a56e76415682e4de7000b1e42d582e04fd714e5dea15"} Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.773659 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.819266 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be8317c5-1327-4216-bf3f-400253bdfa3c-logs\") pod \"be8317c5-1327-4216-bf3f-400253bdfa3c\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.819418 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-scripts\") pod \"be8317c5-1327-4216-bf3f-400253bdfa3c\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.819454 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-config-data\") pod \"be8317c5-1327-4216-bf3f-400253bdfa3c\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.819544 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-dknw8\" (UniqueName: \"kubernetes.io/projected/be8317c5-1327-4216-bf3f-400253bdfa3c-kube-api-access-dknw8\") pod \"be8317c5-1327-4216-bf3f-400253bdfa3c\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.819572 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-internal-tls-certs\") pod \"be8317c5-1327-4216-bf3f-400253bdfa3c\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.819631 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-combined-ca-bundle\") pod \"be8317c5-1327-4216-bf3f-400253bdfa3c\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.819658 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-public-tls-certs\") pod \"be8317c5-1327-4216-bf3f-400253bdfa3c\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.819682 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-cert-memcached-mtls\") pod \"be8317c5-1327-4216-bf3f-400253bdfa3c\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.819720 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be8317c5-1327-4216-bf3f-400253bdfa3c-etc-machine-id\") pod \"be8317c5-1327-4216-bf3f-400253bdfa3c\" (UID: 
\"be8317c5-1327-4216-bf3f-400253bdfa3c\") " Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.819796 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-config-data-custom\") pod \"be8317c5-1327-4216-bf3f-400253bdfa3c\" (UID: \"be8317c5-1327-4216-bf3f-400253bdfa3c\") " Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.819943 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be8317c5-1327-4216-bf3f-400253bdfa3c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "be8317c5-1327-4216-bf3f-400253bdfa3c" (UID: "be8317c5-1327-4216-bf3f-400253bdfa3c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.820261 4956 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be8317c5-1327-4216-bf3f-400253bdfa3c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.820434 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be8317c5-1327-4216-bf3f-400253bdfa3c-logs" (OuterVolumeSpecName: "logs") pod "be8317c5-1327-4216-bf3f-400253bdfa3c" (UID: "be8317c5-1327-4216-bf3f-400253bdfa3c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.826573 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be8317c5-1327-4216-bf3f-400253bdfa3c-kube-api-access-dknw8" (OuterVolumeSpecName: "kube-api-access-dknw8") pod "be8317c5-1327-4216-bf3f-400253bdfa3c" (UID: "be8317c5-1327-4216-bf3f-400253bdfa3c"). InnerVolumeSpecName "kube-api-access-dknw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.834675 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-scripts" (OuterVolumeSpecName: "scripts") pod "be8317c5-1327-4216-bf3f-400253bdfa3c" (UID: "be8317c5-1327-4216-bf3f-400253bdfa3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.871075 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "be8317c5-1327-4216-bf3f-400253bdfa3c" (UID: "be8317c5-1327-4216-bf3f-400253bdfa3c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.889893 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be8317c5-1327-4216-bf3f-400253bdfa3c" (UID: "be8317c5-1327-4216-bf3f-400253bdfa3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.902404 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "be8317c5-1327-4216-bf3f-400253bdfa3c" (UID: "be8317c5-1327-4216-bf3f-400253bdfa3c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.915707 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-config-data" (OuterVolumeSpecName: "config-data") pod "be8317c5-1327-4216-bf3f-400253bdfa3c" (UID: "be8317c5-1327-4216-bf3f-400253bdfa3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.919989 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "be8317c5-1327-4216-bf3f-400253bdfa3c" (UID: "be8317c5-1327-4216-bf3f-400253bdfa3c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.921982 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be8317c5-1327-4216-bf3f-400253bdfa3c-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.922009 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.922018 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.922029 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dknw8\" (UniqueName: \"kubernetes.io/projected/be8317c5-1327-4216-bf3f-400253bdfa3c-kube-api-access-dknw8\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:03 crc 
kubenswrapper[4956]: I0314 09:43:03.922039 4956 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.922048 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.922056 4956 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.922064 4956 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:03 crc kubenswrapper[4956]: I0314 09:43:03.930442 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "be8317c5-1327-4216-bf3f-400253bdfa3c" (UID: "be8317c5-1327-4216-bf3f-400253bdfa3c"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.023818 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/be8317c5-1327-4216-bf3f-400253bdfa3c-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.279376 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_dfc67856-2753-47d2-8863-dec939316fd2/watcher-decision-engine/0.log" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.424867 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.484119 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"be8317c5-1327-4216-bf3f-400253bdfa3c","Type":"ContainerDied","Data":"e5a75031b3babd6966f7fe673ad19d4493825a58f18f7206dd63adc1a93730ce"} Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.484142 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.484171 4956 scope.go:117] "RemoveContainer" containerID="569c9fa74d528637f1f5a56e76415682e4de7000b1e42d582e04fd714e5dea15" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.488025 4956 generic.go:334] "Generic (PLEG): container finished" podID="ee33bd5a-39fa-4066-8d18-06e1c12374b7" containerID="06194ba955e4cd5d8b04f2a331150de087ca37a02423781ab72e8e623f62d420" exitCode=0 Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.488106 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"ee33bd5a-39fa-4066-8d18-06e1c12374b7","Type":"ContainerDied","Data":"06194ba955e4cd5d8b04f2a331150de087ca37a02423781ab72e8e623f62d420"} Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.491192 4956 generic.go:334] "Generic (PLEG): container finished" podID="2954f359-0b87-4748-bd2f-f16d3d5121a4" containerID="3e9b5d27431a022ed94bd7f09a7ae594a2dc4ac0ec6621e81c9263743e575977" exitCode=0 Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.491240 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"2954f359-0b87-4748-bd2f-f16d3d5121a4","Type":"ContainerDied","Data":"3e9b5d27431a022ed94bd7f09a7ae594a2dc4ac0ec6621e81c9263743e575977"} Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.491272 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"2954f359-0b87-4748-bd2f-f16d3d5121a4","Type":"ContainerDied","Data":"74cb53279a160f340c28c86f6e609e1f7e379c7144c75648d336054ce41691fa"} Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.491337 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.530821 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-scripts\") pod \"2954f359-0b87-4748-bd2f-f16d3d5121a4\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.530912 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-config-data-custom\") pod \"2954f359-0b87-4748-bd2f-f16d3d5121a4\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.530959 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-combined-ca-bundle\") pod \"2954f359-0b87-4748-bd2f-f16d3d5121a4\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.531048 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-cert-memcached-mtls\") pod \"2954f359-0b87-4748-bd2f-f16d3d5121a4\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.531126 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2954f359-0b87-4748-bd2f-f16d3d5121a4-etc-machine-id\") pod \"2954f359-0b87-4748-bd2f-f16d3d5121a4\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.531188 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6xc8t\" (UniqueName: \"kubernetes.io/projected/2954f359-0b87-4748-bd2f-f16d3d5121a4-kube-api-access-6xc8t\") pod \"2954f359-0b87-4748-bd2f-f16d3d5121a4\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.531213 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-config-data\") pod \"2954f359-0b87-4748-bd2f-f16d3d5121a4\" (UID: \"2954f359-0b87-4748-bd2f-f16d3d5121a4\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.531837 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2954f359-0b87-4748-bd2f-f16d3d5121a4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2954f359-0b87-4748-bd2f-f16d3d5121a4" (UID: "2954f359-0b87-4748-bd2f-f16d3d5121a4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.534871 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-scripts" (OuterVolumeSpecName: "scripts") pod "2954f359-0b87-4748-bd2f-f16d3d5121a4" (UID: "2954f359-0b87-4748-bd2f-f16d3d5121a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.535093 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2954f359-0b87-4748-bd2f-f16d3d5121a4-kube-api-access-6xc8t" (OuterVolumeSpecName: "kube-api-access-6xc8t") pod "2954f359-0b87-4748-bd2f-f16d3d5121a4" (UID: "2954f359-0b87-4748-bd2f-f16d3d5121a4"). InnerVolumeSpecName "kube-api-access-6xc8t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.535327 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2954f359-0b87-4748-bd2f-f16d3d5121a4" (UID: "2954f359-0b87-4748-bd2f-f16d3d5121a4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.586439 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2954f359-0b87-4748-bd2f-f16d3d5121a4" (UID: "2954f359-0b87-4748-bd2f-f16d3d5121a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.618114 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-config-data" (OuterVolumeSpecName: "config-data") pod "2954f359-0b87-4748-bd2f-f16d3d5121a4" (UID: "2954f359-0b87-4748-bd2f-f16d3d5121a4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.633018 4956 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2954f359-0b87-4748-bd2f-f16d3d5121a4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.633049 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xc8t\" (UniqueName: \"kubernetes.io/projected/2954f359-0b87-4748-bd2f-f16d3d5121a4-kube-api-access-6xc8t\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.633061 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.633068 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.633077 4956 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.633085 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.663045 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "2954f359-0b87-4748-bd2f-f16d3d5121a4" (UID: 
"2954f359-0b87-4748-bd2f-f16d3d5121a4"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.665377 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.669149 4956 scope.go:117] "RemoveContainer" containerID="8a8784e10b8f2c58186aede58c8d97b8877e2b9b542f46aab35dbf186364889a" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.686752 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.693184 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.701354 4956 scope.go:117] "RemoveContainer" containerID="6babc6ae007119e64b96931c96439c5f0923ad12a5892ce6495b37108b53fbd7" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.723583 4956 scope.go:117] "RemoveContainer" containerID="3e9b5d27431a022ed94bd7f09a7ae594a2dc4ac0ec6621e81c9263743e575977" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.733903 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-lib-modules\") pod \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.734113 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-lib-cinder\") pod \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.734196 4956 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "ee33bd5a-39fa-4066-8d18-06e1c12374b7" (UID: "ee33bd5a-39fa-4066-8d18-06e1c12374b7"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.734233 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "ee33bd5a-39fa-4066-8d18-06e1c12374b7" (UID: "ee33bd5a-39fa-4066-8d18-06e1c12374b7"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.734320 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-nvme\") pod \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.734407 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-dev\") pod \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.734495 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-run\") pod \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.734596 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: 
\"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-cert-memcached-mtls\") pod \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.734663 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-machine-id\") pod \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.734401 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "ee33bd5a-39fa-4066-8d18-06e1c12374b7" (UID: "ee33bd5a-39fa-4066-8d18-06e1c12374b7"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.734434 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-dev" (OuterVolumeSpecName: "dev") pod "ee33bd5a-39fa-4066-8d18-06e1c12374b7" (UID: "ee33bd5a-39fa-4066-8d18-06e1c12374b7"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.734523 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-run" (OuterVolumeSpecName: "run") pod "ee33bd5a-39fa-4066-8d18-06e1c12374b7" (UID: "ee33bd5a-39fa-4066-8d18-06e1c12374b7"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.734765 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ee33bd5a-39fa-4066-8d18-06e1c12374b7" (UID: "ee33bd5a-39fa-4066-8d18-06e1c12374b7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.734869 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-config-data\") pod \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.734950 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-locks-brick\") pod \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.735025 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-iscsi\") pod \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.735111 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsctn\" (UniqueName: \"kubernetes.io/projected/ee33bd5a-39fa-4066-8d18-06e1c12374b7-kube-api-access-fsctn\") pod \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.735159 4956 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "ee33bd5a-39fa-4066-8d18-06e1c12374b7" (UID: "ee33bd5a-39fa-4066-8d18-06e1c12374b7"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.735177 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "ee33bd5a-39fa-4066-8d18-06e1c12374b7" (UID: "ee33bd5a-39fa-4066-8d18-06e1c12374b7"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.735260 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-combined-ca-bundle\") pod \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.735346 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-config-data-custom\") pod \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.735416 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-scripts\") pod \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.735507 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-sys\") pod \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.735596 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-locks-cinder\") pod \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\" (UID: \"ee33bd5a-39fa-4066-8d18-06e1c12374b7\") " Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.736018 4956 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.736090 4956 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.736144 4956 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.736261 4956 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-dev\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.736316 4956 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-run\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.736367 4956 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.736418 4956 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.736474 4956 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.736561 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2954f359-0b87-4748-bd2f-f16d3d5121a4-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.736643 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "ee33bd5a-39fa-4066-8d18-06e1c12374b7" (UID: "ee33bd5a-39fa-4066-8d18-06e1c12374b7"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.736716 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-sys" (OuterVolumeSpecName: "sys") pod "ee33bd5a-39fa-4066-8d18-06e1c12374b7" (UID: "ee33bd5a-39fa-4066-8d18-06e1c12374b7"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.741415 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-scripts" (OuterVolumeSpecName: "scripts") pod "ee33bd5a-39fa-4066-8d18-06e1c12374b7" (UID: "ee33bd5a-39fa-4066-8d18-06e1c12374b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.741459 4956 scope.go:117] "RemoveContainer" containerID="6babc6ae007119e64b96931c96439c5f0923ad12a5892ce6495b37108b53fbd7" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.741562 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ee33bd5a-39fa-4066-8d18-06e1c12374b7" (UID: "ee33bd5a-39fa-4066-8d18-06e1c12374b7"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: E0314 09:43:04.742000 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6babc6ae007119e64b96931c96439c5f0923ad12a5892ce6495b37108b53fbd7\": container with ID starting with 6babc6ae007119e64b96931c96439c5f0923ad12a5892ce6495b37108b53fbd7 not found: ID does not exist" containerID="6babc6ae007119e64b96931c96439c5f0923ad12a5892ce6495b37108b53fbd7" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.742042 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6babc6ae007119e64b96931c96439c5f0923ad12a5892ce6495b37108b53fbd7"} err="failed to get container status \"6babc6ae007119e64b96931c96439c5f0923ad12a5892ce6495b37108b53fbd7\": rpc error: code = NotFound desc = could not find container \"6babc6ae007119e64b96931c96439c5f0923ad12a5892ce6495b37108b53fbd7\": container with ID starting with 6babc6ae007119e64b96931c96439c5f0923ad12a5892ce6495b37108b53fbd7 not found: ID does not exist" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.742066 4956 scope.go:117] "RemoveContainer" containerID="3e9b5d27431a022ed94bd7f09a7ae594a2dc4ac0ec6621e81c9263743e575977" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.742167 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee33bd5a-39fa-4066-8d18-06e1c12374b7-kube-api-access-fsctn" (OuterVolumeSpecName: "kube-api-access-fsctn") pod "ee33bd5a-39fa-4066-8d18-06e1c12374b7" (UID: "ee33bd5a-39fa-4066-8d18-06e1c12374b7"). InnerVolumeSpecName "kube-api-access-fsctn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: E0314 09:43:04.742840 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e9b5d27431a022ed94bd7f09a7ae594a2dc4ac0ec6621e81c9263743e575977\": container with ID starting with 3e9b5d27431a022ed94bd7f09a7ae594a2dc4ac0ec6621e81c9263743e575977 not found: ID does not exist" containerID="3e9b5d27431a022ed94bd7f09a7ae594a2dc4ac0ec6621e81c9263743e575977" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.742873 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9b5d27431a022ed94bd7f09a7ae594a2dc4ac0ec6621e81c9263743e575977"} err="failed to get container status \"3e9b5d27431a022ed94bd7f09a7ae594a2dc4ac0ec6621e81c9263743e575977\": rpc error: code = NotFound desc = could not find container \"3e9b5d27431a022ed94bd7f09a7ae594a2dc4ac0ec6621e81c9263743e575977\": container with ID starting with 3e9b5d27431a022ed94bd7f09a7ae594a2dc4ac0ec6621e81c9263743e575977 not found: ID does not exist" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.775217 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee33bd5a-39fa-4066-8d18-06e1c12374b7" (UID: "ee33bd5a-39fa-4066-8d18-06e1c12374b7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.835388 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.838076 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsctn\" (UniqueName: \"kubernetes.io/projected/ee33bd5a-39fa-4066-8d18-06e1c12374b7-kube-api-access-fsctn\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.838109 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.838118 4956 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.838127 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.838137 4956 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-sys\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.838148 4956 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ee33bd5a-39fa-4066-8d18-06e1c12374b7-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.844412 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Mar 14 09:43:04 
crc kubenswrapper[4956]: I0314 09:43:04.853240 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-db-create-d7v5r"] Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.858567 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-config-data" (OuterVolumeSpecName: "config-data") pod "ee33bd5a-39fa-4066-8d18-06e1c12374b7" (UID: "ee33bd5a-39fa-4066-8d18-06e1c12374b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.872193 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-db-create-d7v5r"] Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.874568 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "ee33bd5a-39fa-4066-8d18-06e1c12374b7" (UID: "ee33bd5a-39fa-4066-8d18-06e1c12374b7"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.880207 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder46a3-account-delete-4rngm"] Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.888126 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder46a3-account-delete-4rngm"] Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.896491 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4"] Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.904742 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-46a3-account-create-update-ln9q4"] Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.939285 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:04 crc kubenswrapper[4956]: I0314 09:43:04.939329 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee33bd5a-39fa-4066-8d18-06e1c12374b7-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.219287 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2954f359-0b87-4748-bd2f-f16d3d5121a4" path="/var/lib/kubelet/pods/2954f359-0b87-4748-bd2f-f16d3d5121a4/volumes" Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.220314 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b5002d8-2ce5-49ae-bb50-e3d61d98ee47" path="/var/lib/kubelet/pods/8b5002d8-2ce5-49ae-bb50-e3d61d98ee47/volumes" Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.221462 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0162871-a7ca-4c7b-8147-884d131abcd6" 
path="/var/lib/kubelet/pods/a0162871-a7ca-4c7b-8147-884d131abcd6/volumes" Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.223202 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be8317c5-1327-4216-bf3f-400253bdfa3c" path="/var/lib/kubelet/pods/be8317c5-1327-4216-bf3f-400253bdfa3c/volumes" Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.224295 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c28c8d67-468c-4083-ad8a-17fdfa500bff" path="/var/lib/kubelet/pods/c28c8d67-468c-4083-ad8a-17fdfa500bff/volumes" Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.479811 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_dfc67856-2753-47d2-8863-dec939316fd2/watcher-decision-engine/0.log" Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.511106 4956 generic.go:334] "Generic (PLEG): container finished" podID="dfc67856-2753-47d2-8863-dec939316fd2" containerID="2138362324a289a61f1abdb80a88d86ec50863ba3f271b4ed1dfdd791ac8b296" exitCode=0 Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.511433 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"dfc67856-2753-47d2-8863-dec939316fd2","Type":"ContainerDied","Data":"2138362324a289a61f1abdb80a88d86ec50863ba3f271b4ed1dfdd791ac8b296"} Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.537759 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"ee33bd5a-39fa-4066-8d18-06e1c12374b7","Type":"ContainerDied","Data":"4fcd7e33cfbaeb3912d96786bb02dd5d456cafa757c62fe556f3e211c0f5a827"} Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.537815 4956 scope.go:117] "RemoveContainer" containerID="ab1b972777088af64b581d52e000dcb4a97aa4f5768715cbfaaa39ba7fcb25b4" Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.537847 4956 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.577905 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.585749 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.605511 4956 scope.go:117] "RemoveContainer" containerID="06194ba955e4cd5d8b04f2a331150de087ca37a02423781ab72e8e623f62d420" Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.816362 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.955389 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-config-data\") pod \"dfc67856-2753-47d2-8863-dec939316fd2\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.955444 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-cert-memcached-mtls\") pod \"dfc67856-2753-47d2-8863-dec939316fd2\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.955503 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcbxn\" (UniqueName: \"kubernetes.io/projected/dfc67856-2753-47d2-8863-dec939316fd2-kube-api-access-vcbxn\") pod \"dfc67856-2753-47d2-8863-dec939316fd2\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.955567 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-combined-ca-bundle\") pod \"dfc67856-2753-47d2-8863-dec939316fd2\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.955641 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfc67856-2753-47d2-8863-dec939316fd2-logs\") pod \"dfc67856-2753-47d2-8863-dec939316fd2\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.955691 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-custom-prometheus-ca\") pod \"dfc67856-2753-47d2-8863-dec939316fd2\" (UID: \"dfc67856-2753-47d2-8863-dec939316fd2\") " Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.956216 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfc67856-2753-47d2-8863-dec939316fd2-logs" (OuterVolumeSpecName: "logs") pod "dfc67856-2753-47d2-8863-dec939316fd2" (UID: "dfc67856-2753-47d2-8863-dec939316fd2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.961984 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfc67856-2753-47d2-8863-dec939316fd2-kube-api-access-vcbxn" (OuterVolumeSpecName: "kube-api-access-vcbxn") pod "dfc67856-2753-47d2-8863-dec939316fd2" (UID: "dfc67856-2753-47d2-8863-dec939316fd2"). InnerVolumeSpecName "kube-api-access-vcbxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.976435 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "dfc67856-2753-47d2-8863-dec939316fd2" (UID: "dfc67856-2753-47d2-8863-dec939316fd2"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.984086 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfc67856-2753-47d2-8863-dec939316fd2" (UID: "dfc67856-2753-47d2-8863-dec939316fd2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:05 crc kubenswrapper[4956]: I0314 09:43:05.996187 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-config-data" (OuterVolumeSpecName: "config-data") pod "dfc67856-2753-47d2-8863-dec939316fd2" (UID: "dfc67856-2753-47d2-8863-dec939316fd2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.026245 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "dfc67856-2753-47d2-8863-dec939316fd2" (UID: "dfc67856-2753-47d2-8863-dec939316fd2"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.057606 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcbxn\" (UniqueName: \"kubernetes.io/projected/dfc67856-2753-47d2-8863-dec939316fd2-kube-api-access-vcbxn\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.057652 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.057669 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfc67856-2753-47d2-8863-dec939316fd2-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.057683 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.057695 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.057707 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/dfc67856-2753-47d2-8863-dec939316fd2-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.549505 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"dfc67856-2753-47d2-8863-dec939316fd2","Type":"ContainerDied","Data":"0a4be1389802e084cea09e5ca6d7dcd1024dc4312c3808befadf0614b8b485e5"} Mar 
14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.549555 4956 scope.go:117] "RemoveContainer" containerID="2138362324a289a61f1abdb80a88d86ec50863ba3f271b4ed1dfdd791ac8b296" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.549556 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.583069 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.595884 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.607558 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:43:06 crc kubenswrapper[4956]: E0314 09:43:06.608005 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2954f359-0b87-4748-bd2f-f16d3d5121a4" containerName="probe" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.608031 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="2954f359-0b87-4748-bd2f-f16d3d5121a4" containerName="probe" Mar 14 09:43:06 crc kubenswrapper[4956]: E0314 09:43:06.608048 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b5002d8-2ce5-49ae-bb50-e3d61d98ee47" containerName="mariadb-account-delete" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.608057 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b5002d8-2ce5-49ae-bb50-e3d61d98ee47" containerName="mariadb-account-delete" Mar 14 09:43:06 crc kubenswrapper[4956]: E0314 09:43:06.608072 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2954f359-0b87-4748-bd2f-f16d3d5121a4" containerName="cinder-scheduler" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.608080 4956 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="2954f359-0b87-4748-bd2f-f16d3d5121a4" containerName="cinder-scheduler" Mar 14 09:43:06 crc kubenswrapper[4956]: E0314 09:43:06.608099 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee33bd5a-39fa-4066-8d18-06e1c12374b7" containerName="cinder-backup" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.608108 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee33bd5a-39fa-4066-8d18-06e1c12374b7" containerName="cinder-backup" Mar 14 09:43:06 crc kubenswrapper[4956]: E0314 09:43:06.608119 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc67856-2753-47d2-8863-dec939316fd2" containerName="watcher-decision-engine" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.608127 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc67856-2753-47d2-8863-dec939316fd2" containerName="watcher-decision-engine" Mar 14 09:43:06 crc kubenswrapper[4956]: E0314 09:43:06.608147 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be8317c5-1327-4216-bf3f-400253bdfa3c" containerName="cinder-api-log" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.608155 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="be8317c5-1327-4216-bf3f-400253bdfa3c" containerName="cinder-api-log" Mar 14 09:43:06 crc kubenswrapper[4956]: E0314 09:43:06.608181 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be8317c5-1327-4216-bf3f-400253bdfa3c" containerName="cinder-api" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.608189 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="be8317c5-1327-4216-bf3f-400253bdfa3c" containerName="cinder-api" Mar 14 09:43:06 crc kubenswrapper[4956]: E0314 09:43:06.608203 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee33bd5a-39fa-4066-8d18-06e1c12374b7" containerName="probe" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.608211 4956 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ee33bd5a-39fa-4066-8d18-06e1c12374b7" containerName="probe" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.608424 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="be8317c5-1327-4216-bf3f-400253bdfa3c" containerName="cinder-api" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.608452 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfc67856-2753-47d2-8863-dec939316fd2" containerName="watcher-decision-engine" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.608461 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b5002d8-2ce5-49ae-bb50-e3d61d98ee47" containerName="mariadb-account-delete" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.608472 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="be8317c5-1327-4216-bf3f-400253bdfa3c" containerName="cinder-api-log" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.608501 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee33bd5a-39fa-4066-8d18-06e1c12374b7" containerName="cinder-backup" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.608512 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee33bd5a-39fa-4066-8d18-06e1c12374b7" containerName="probe" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.608529 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="2954f359-0b87-4748-bd2f-f16d3d5121a4" containerName="cinder-scheduler" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.608540 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="2954f359-0b87-4748-bd2f-f16d3d5121a4" containerName="probe" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.609230 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.611113 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.615567 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.668168 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.668231 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.668353 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.668390 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.668606 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6546\" (UniqueName: \"kubernetes.io/projected/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-kube-api-access-c6546\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.668694 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.770211 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.770299 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.770322 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.770378 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6546\" (UniqueName: \"kubernetes.io/projected/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-kube-api-access-c6546\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.770418 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.770577 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.770938 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.774404 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.774466 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.775417 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.775557 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 09:43:06.787868 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6546\" (UniqueName: \"kubernetes.io/projected/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-kube-api-access-c6546\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:06 crc kubenswrapper[4956]: I0314 
09:43:06.931307 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:07 crc kubenswrapper[4956]: I0314 09:43:07.219850 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfc67856-2753-47d2-8863-dec939316fd2" path="/var/lib/kubelet/pods/dfc67856-2753-47d2-8863-dec939316fd2/volumes" Mar 14 09:43:07 crc kubenswrapper[4956]: I0314 09:43:07.221044 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee33bd5a-39fa-4066-8d18-06e1c12374b7" path="/var/lib/kubelet/pods/ee33bd5a-39fa-4066-8d18-06e1c12374b7/volumes" Mar 14 09:43:07 crc kubenswrapper[4956]: I0314 09:43:07.346386 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:43:07 crc kubenswrapper[4956]: W0314 09:43:07.351169 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4e5b2e4_bf54_404f_9a07_d13e9ec4f723.slice/crio-ea51d01128a40f60767c051cbc58b18b4190bedd1a08811c20a8b736cfbd38d7 WatchSource:0}: Error finding container ea51d01128a40f60767c051cbc58b18b4190bedd1a08811c20a8b736cfbd38d7: Status 404 returned error can't find the container with id ea51d01128a40f60767c051cbc58b18b4190bedd1a08811c20a8b736cfbd38d7 Mar 14 09:43:07 crc kubenswrapper[4956]: I0314 09:43:07.560553 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723","Type":"ContainerStarted","Data":"0a92862e43cd5fba63471c5ce410462133b425726fb5c75e10b29766831f5ddd"} Mar 14 09:43:07 crc kubenswrapper[4956]: I0314 09:43:07.560911 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
event={"ID":"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723","Type":"ContainerStarted","Data":"ea51d01128a40f60767c051cbc58b18b4190bedd1a08811c20a8b736cfbd38d7"} Mar 14 09:43:07 crc kubenswrapper[4956]: I0314 09:43:07.579774 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=1.579756434 podStartE2EDuration="1.579756434s" podCreationTimestamp="2026-03-14 09:43:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:43:07.573858425 +0000 UTC m=+2793.086550693" watchObservedRunningTime="2026-03-14 09:43:07.579756434 +0000 UTC m=+2793.092448702" Mar 14 09:43:07 crc kubenswrapper[4956]: I0314 09:43:07.757411 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_b4e5b2e4-bf54-404f-9a07-d13e9ec4f723/watcher-decision-engine/0.log" Mar 14 09:43:08 crc kubenswrapper[4956]: I0314 09:43:08.899919 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_b4e5b2e4-bf54-404f-9a07-d13e9ec4f723/watcher-decision-engine/0.log" Mar 14 09:43:10 crc kubenswrapper[4956]: I0314 09:43:10.042985 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_b4e5b2e4-bf54-404f-9a07-d13e9ec4f723/watcher-decision-engine/0.log" Mar 14 09:43:11 crc kubenswrapper[4956]: I0314 09:43:11.225580 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_b4e5b2e4-bf54-404f-9a07-d13e9ec4f723/watcher-decision-engine/0.log" Mar 14 09:43:12 crc kubenswrapper[4956]: I0314 09:43:12.447112 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_b4e5b2e4-bf54-404f-9a07-d13e9ec4f723/watcher-decision-engine/0.log" Mar 14 09:43:13 
crc kubenswrapper[4956]: I0314 09:43:13.625216 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_b4e5b2e4-bf54-404f-9a07-d13e9ec4f723/watcher-decision-engine/0.log" Mar 14 09:43:14 crc kubenswrapper[4956]: I0314 09:43:14.823527 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_b4e5b2e4-bf54-404f-9a07-d13e9ec4f723/watcher-decision-engine/0.log" Mar 14 09:43:16 crc kubenswrapper[4956]: I0314 09:43:16.024747 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_b4e5b2e4-bf54-404f-9a07-d13e9ec4f723/watcher-decision-engine/0.log" Mar 14 09:43:16 crc kubenswrapper[4956]: I0314 09:43:16.932361 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:16 crc kubenswrapper[4956]: I0314 09:43:16.970972 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:17 crc kubenswrapper[4956]: I0314 09:43:17.188203 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_b4e5b2e4-bf54-404f-9a07-d13e9ec4f723/watcher-decision-engine/0.log" Mar 14 09:43:17 crc kubenswrapper[4956]: I0314 09:43:17.640915 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:17 crc kubenswrapper[4956]: I0314 09:43:17.676070 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.369768 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_b4e5b2e4-bf54-404f-9a07-d13e9ec4f723/watcher-decision-engine/0.log" Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.493410 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp"] Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.501934 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-sjwtp"] Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.538246 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher2dda-account-delete-j4rt2"] Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.539598 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher2dda-account-delete-j4rt2" Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.548979 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher2dda-account-delete-j4rt2"] Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.559734 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.559960 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="06eb83ec-f4cc-4b4d-a129-cb029c188810" containerName="watcher-applier" containerID="cri-o://79e969033a432d36fda60588046b667991384b93b2b83cc0270dcc12ba55ddc9" gracePeriod=30 Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.613425 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.630451 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.630709 
4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="37b8cbae-06d6-4ce8-b3c0-ab05743ebb80" containerName="watcher-kuttl-api-log" containerID="cri-o://8cdc4ce98055306fde55b3a21b37398de8997ab5d2c0790dd8a28464255a27cb" gracePeriod=30 Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.630871 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="37b8cbae-06d6-4ce8-b3c0-ab05743ebb80" containerName="watcher-api" containerID="cri-o://7e5cf6b205184cc09c2b7019a3e9b09c1b956a7ceaf1e9c02ce1f5e4b0c5b4ab" gracePeriod=30 Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.654164 4956 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" secret="" err="secret \"watcher-watcher-kuttl-dockercfg-tz552\" not found" Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.666991 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtdmq\" (UniqueName: \"kubernetes.io/projected/31e52dda-c821-42ea-a7a7-85d6f1c4dab1-kube-api-access-jtdmq\") pod \"watcher2dda-account-delete-j4rt2\" (UID: \"31e52dda-c821-42ea-a7a7-85d6f1c4dab1\") " pod="watcher-kuttl-default/watcher2dda-account-delete-j4rt2" Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.667346 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e52dda-c821-42ea-a7a7-85d6f1c4dab1-operator-scripts\") pod \"watcher2dda-account-delete-j4rt2\" (UID: \"31e52dda-c821-42ea-a7a7-85d6f1c4dab1\") " pod="watcher-kuttl-default/watcher2dda-account-delete-j4rt2" Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.768248 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jtdmq\" (UniqueName: \"kubernetes.io/projected/31e52dda-c821-42ea-a7a7-85d6f1c4dab1-kube-api-access-jtdmq\") pod \"watcher2dda-account-delete-j4rt2\" (UID: \"31e52dda-c821-42ea-a7a7-85d6f1c4dab1\") " pod="watcher-kuttl-default/watcher2dda-account-delete-j4rt2" Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.768340 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e52dda-c821-42ea-a7a7-85d6f1c4dab1-operator-scripts\") pod \"watcher2dda-account-delete-j4rt2\" (UID: \"31e52dda-c821-42ea-a7a7-85d6f1c4dab1\") " pod="watcher-kuttl-default/watcher2dda-account-delete-j4rt2" Mar 14 09:43:18 crc kubenswrapper[4956]: E0314 09:43:18.769367 4956 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:43:18 crc kubenswrapper[4956]: E0314 09:43:18.769413 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-config-data podName:b4e5b2e4-bf54-404f-9a07-d13e9ec4f723 nodeName:}" failed. No retries permitted until 2026-03-14 09:43:19.269399504 +0000 UTC m=+2804.782091762 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "b4e5b2e4-bf54-404f-9a07-d13e9ec4f723") : secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.769843 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e52dda-c821-42ea-a7a7-85d6f1c4dab1-operator-scripts\") pod \"watcher2dda-account-delete-j4rt2\" (UID: \"31e52dda-c821-42ea-a7a7-85d6f1c4dab1\") " pod="watcher-kuttl-default/watcher2dda-account-delete-j4rt2" Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.788393 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtdmq\" (UniqueName: \"kubernetes.io/projected/31e52dda-c821-42ea-a7a7-85d6f1c4dab1-kube-api-access-jtdmq\") pod \"watcher2dda-account-delete-j4rt2\" (UID: \"31e52dda-c821-42ea-a7a7-85d6f1c4dab1\") " pod="watcher-kuttl-default/watcher2dda-account-delete-j4rt2" Mar 14 09:43:18 crc kubenswrapper[4956]: I0314 09:43:18.874767 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher2dda-account-delete-j4rt2" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.229701 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6" path="/var/lib/kubelet/pods/0ab8d230-dc20-4f35-bb22-fe0bb9c3a0e6/volumes" Mar 14 09:43:19 crc kubenswrapper[4956]: E0314 09:43:19.279462 4956 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:43:19 crc kubenswrapper[4956]: E0314 09:43:19.279592 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-config-data podName:b4e5b2e4-bf54-404f-9a07-d13e9ec4f723 nodeName:}" failed. No retries permitted until 2026-03-14 09:43:20.279577497 +0000 UTC m=+2805.792269765 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "b4e5b2e4-bf54-404f-9a07-d13e9ec4f723") : secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.332964 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher2dda-account-delete-j4rt2"] Mar 14 09:43:19 crc kubenswrapper[4956]: W0314 09:43:19.337320 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31e52dda_c821_42ea_a7a7_85d6f1c4dab1.slice/crio-da07026eb1cfe9fdfcfbe7feeae6920d7cf69c0de6986740aab41f762501da55 WatchSource:0}: Error finding container da07026eb1cfe9fdfcfbe7feeae6920d7cf69c0de6986740aab41f762501da55: Status 404 returned error can't find the container with id da07026eb1cfe9fdfcfbe7feeae6920d7cf69c0de6986740aab41f762501da55 Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 
09:43:19.562370 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.580501 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-cert-memcached-mtls\") pod \"06eb83ec-f4cc-4b4d-a129-cb029c188810\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.580544 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06eb83ec-f4cc-4b4d-a129-cb029c188810-logs\") pod \"06eb83ec-f4cc-4b4d-a129-cb029c188810\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.580619 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82btc\" (UniqueName: \"kubernetes.io/projected/06eb83ec-f4cc-4b4d-a129-cb029c188810-kube-api-access-82btc\") pod \"06eb83ec-f4cc-4b4d-a129-cb029c188810\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.580635 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-combined-ca-bundle\") pod \"06eb83ec-f4cc-4b4d-a129-cb029c188810\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.580676 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-config-data\") pod \"06eb83ec-f4cc-4b4d-a129-cb029c188810\" (UID: \"06eb83ec-f4cc-4b4d-a129-cb029c188810\") " Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.582322 4956 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06eb83ec-f4cc-4b4d-a129-cb029c188810-logs" (OuterVolumeSpecName: "logs") pod "06eb83ec-f4cc-4b4d-a129-cb029c188810" (UID: "06eb83ec-f4cc-4b4d-a129-cb029c188810"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.591594 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06eb83ec-f4cc-4b4d-a129-cb029c188810-kube-api-access-82btc" (OuterVolumeSpecName: "kube-api-access-82btc") pod "06eb83ec-f4cc-4b4d-a129-cb029c188810" (UID: "06eb83ec-f4cc-4b4d-a129-cb029c188810"). InnerVolumeSpecName "kube-api-access-82btc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.608146 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06eb83ec-f4cc-4b4d-a129-cb029c188810" (UID: "06eb83ec-f4cc-4b4d-a129-cb029c188810"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.648035 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-config-data" (OuterVolumeSpecName: "config-data") pod "06eb83ec-f4cc-4b4d-a129-cb029c188810" (UID: "06eb83ec-f4cc-4b4d-a129-cb029c188810"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.650568 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "06eb83ec-f4cc-4b4d-a129-cb029c188810" (UID: "06eb83ec-f4cc-4b4d-a129-cb029c188810"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.682011 4956 generic.go:334] "Generic (PLEG): container finished" podID="37b8cbae-06d6-4ce8-b3c0-ab05743ebb80" containerID="7e5cf6b205184cc09c2b7019a3e9b09c1b956a7ceaf1e9c02ce1f5e4b0c5b4ab" exitCode=0 Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.682042 4956 generic.go:334] "Generic (PLEG): container finished" podID="37b8cbae-06d6-4ce8-b3c0-ab05743ebb80" containerID="8cdc4ce98055306fde55b3a21b37398de8997ab5d2c0790dd8a28464255a27cb" exitCode=143 Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.682079 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80","Type":"ContainerDied","Data":"7e5cf6b205184cc09c2b7019a3e9b09c1b956a7ceaf1e9c02ce1f5e4b0c5b4ab"} Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.682105 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80","Type":"ContainerDied","Data":"8cdc4ce98055306fde55b3a21b37398de8997ab5d2c0790dd8a28464255a27cb"} Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.682526 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82btc\" (UniqueName: \"kubernetes.io/projected/06eb83ec-f4cc-4b4d-a129-cb029c188810-kube-api-access-82btc\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.682552 4956 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.682560 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.682569 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/06eb83ec-f4cc-4b4d-a129-cb029c188810-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.682579 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06eb83ec-f4cc-4b4d-a129-cb029c188810-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.686544 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher2dda-account-delete-j4rt2" event={"ID":"31e52dda-c821-42ea-a7a7-85d6f1c4dab1","Type":"ContainerStarted","Data":"ac5afc3dbeb9d9b56af193314acd72c0aa0769368fc184897822a199824d8369"} Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.686571 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher2dda-account-delete-j4rt2" event={"ID":"31e52dda-c821-42ea-a7a7-85d6f1c4dab1","Type":"ContainerStarted","Data":"da07026eb1cfe9fdfcfbe7feeae6920d7cf69c0de6986740aab41f762501da55"} Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.689738 4956 generic.go:334] "Generic (PLEG): container finished" podID="06eb83ec-f4cc-4b4d-a129-cb029c188810" containerID="79e969033a432d36fda60588046b667991384b93b2b83cc0270dcc12ba55ddc9" exitCode=0 Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.689796 4956 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.689834 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"06eb83ec-f4cc-4b4d-a129-cb029c188810","Type":"ContainerDied","Data":"79e969033a432d36fda60588046b667991384b93b2b83cc0270dcc12ba55ddc9"} Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.689864 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"06eb83ec-f4cc-4b4d-a129-cb029c188810","Type":"ContainerDied","Data":"0de3c8dc7cb59adbf114800057a183b0638cdda238607ed451eeb08da58ddf91"} Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.689884 4956 scope.go:117] "RemoveContainer" containerID="79e969033a432d36fda60588046b667991384b93b2b83cc0270dcc12ba55ddc9" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.691121 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="b4e5b2e4-bf54-404f-9a07-d13e9ec4f723" containerName="watcher-decision-engine" containerID="cri-o://0a92862e43cd5fba63471c5ce410462133b425726fb5c75e10b29766831f5ddd" gracePeriod=30 Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.713720 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher2dda-account-delete-j4rt2" podStartSLOduration=1.713696225 podStartE2EDuration="1.713696225s" podCreationTimestamp="2026-03-14 09:43:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:43:19.713064709 +0000 UTC m=+2805.225756977" watchObservedRunningTime="2026-03-14 09:43:19.713696225 +0000 UTC m=+2805.226388493" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.721089 4956 scope.go:117] "RemoveContainer" 
containerID="79e969033a432d36fda60588046b667991384b93b2b83cc0270dcc12ba55ddc9" Mar 14 09:43:19 crc kubenswrapper[4956]: E0314 09:43:19.721518 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79e969033a432d36fda60588046b667991384b93b2b83cc0270dcc12ba55ddc9\": container with ID starting with 79e969033a432d36fda60588046b667991384b93b2b83cc0270dcc12ba55ddc9 not found: ID does not exist" containerID="79e969033a432d36fda60588046b667991384b93b2b83cc0270dcc12ba55ddc9" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.721548 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79e969033a432d36fda60588046b667991384b93b2b83cc0270dcc12ba55ddc9"} err="failed to get container status \"79e969033a432d36fda60588046b667991384b93b2b83cc0270dcc12ba55ddc9\": rpc error: code = NotFound desc = could not find container \"79e969033a432d36fda60588046b667991384b93b2b83cc0270dcc12ba55ddc9\": container with ID starting with 79e969033a432d36fda60588046b667991384b93b2b83cc0270dcc12ba55ddc9 not found: ID does not exist" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.742698 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.758552 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.905863 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.987295 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-config-data\") pod \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.987370 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-cert-memcached-mtls\") pod \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.987397 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-custom-prometheus-ca\") pod \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.987455 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stmk5\" (UniqueName: \"kubernetes.io/projected/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-kube-api-access-stmk5\") pod \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.987510 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-logs\") pod \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.987546 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-combined-ca-bundle\") pod \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\" (UID: \"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80\") " Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.990073 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-logs" (OuterVolumeSpecName: "logs") pod "37b8cbae-06d6-4ce8-b3c0-ab05743ebb80" (UID: "37b8cbae-06d6-4ce8-b3c0-ab05743ebb80"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:43:19 crc kubenswrapper[4956]: I0314 09:43:19.993602 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-kube-api-access-stmk5" (OuterVolumeSpecName: "kube-api-access-stmk5") pod "37b8cbae-06d6-4ce8-b3c0-ab05743ebb80" (UID: "37b8cbae-06d6-4ce8-b3c0-ab05743ebb80"). InnerVolumeSpecName "kube-api-access-stmk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.038523 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37b8cbae-06d6-4ce8-b3c0-ab05743ebb80" (UID: "37b8cbae-06d6-4ce8-b3c0-ab05743ebb80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.045541 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "37b8cbae-06d6-4ce8-b3c0-ab05743ebb80" (UID: "37b8cbae-06d6-4ce8-b3c0-ab05743ebb80"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.070373 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-config-data" (OuterVolumeSpecName: "config-data") pod "37b8cbae-06d6-4ce8-b3c0-ab05743ebb80" (UID: "37b8cbae-06d6-4ce8-b3c0-ab05743ebb80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.071397 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "37b8cbae-06d6-4ce8-b3c0-ab05743ebb80" (UID: "37b8cbae-06d6-4ce8-b3c0-ab05743ebb80"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.089393 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stmk5\" (UniqueName: \"kubernetes.io/projected/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-kube-api-access-stmk5\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.089432 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.089445 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.089456 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 
09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.089468 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.089494 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:20 crc kubenswrapper[4956]: E0314 09:43:20.294709 4956 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:43:20 crc kubenswrapper[4956]: E0314 09:43:20.295663 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-config-data podName:b4e5b2e4-bf54-404f-9a07-d13e9ec4f723 nodeName:}" failed. No retries permitted until 2026-03-14 09:43:22.295634484 +0000 UTC m=+2807.808326752 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "b4e5b2e4-bf54-404f-9a07-d13e9ec4f723") : secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.698921 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"37b8cbae-06d6-4ce8-b3c0-ab05743ebb80","Type":"ContainerDied","Data":"e00a45429abb13a296fe7d50e78121c18642046878c14c76746c16ec44db501b"} Mar 14 09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.698969 4956 scope.go:117] "RemoveContainer" containerID="7e5cf6b205184cc09c2b7019a3e9b09c1b956a7ceaf1e9c02ce1f5e4b0c5b4ab" Mar 14 09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.699050 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.712779 4956 generic.go:334] "Generic (PLEG): container finished" podID="31e52dda-c821-42ea-a7a7-85d6f1c4dab1" containerID="ac5afc3dbeb9d9b56af193314acd72c0aa0769368fc184897822a199824d8369" exitCode=0 Mar 14 09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.712859 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher2dda-account-delete-j4rt2" event={"ID":"31e52dda-c821-42ea-a7a7-85d6f1c4dab1","Type":"ContainerDied","Data":"ac5afc3dbeb9d9b56af193314acd72c0aa0769368fc184897822a199824d8369"} Mar 14 09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.734061 4956 scope.go:117] "RemoveContainer" containerID="8cdc4ce98055306fde55b3a21b37398de8997ab5d2c0790dd8a28464255a27cb" Mar 14 09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.758287 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:43:20 crc kubenswrapper[4956]: I0314 09:43:20.767740 4956 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:43:21 crc kubenswrapper[4956]: I0314 09:43:21.219135 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06eb83ec-f4cc-4b4d-a129-cb029c188810" path="/var/lib/kubelet/pods/06eb83ec-f4cc-4b4d-a129-cb029c188810/volumes" Mar 14 09:43:21 crc kubenswrapper[4956]: I0314 09:43:21.220117 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b8cbae-06d6-4ce8-b3c0-ab05743ebb80" path="/var/lib/kubelet/pods/37b8cbae-06d6-4ce8-b3c0-ab05743ebb80/volumes" Mar 14 09:43:22 crc kubenswrapper[4956]: I0314 09:43:22.080875 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher2dda-account-delete-j4rt2" Mar 14 09:43:22 crc kubenswrapper[4956]: I0314 09:43:22.222371 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e52dda-c821-42ea-a7a7-85d6f1c4dab1-operator-scripts\") pod \"31e52dda-c821-42ea-a7a7-85d6f1c4dab1\" (UID: \"31e52dda-c821-42ea-a7a7-85d6f1c4dab1\") " Mar 14 09:43:22 crc kubenswrapper[4956]: I0314 09:43:22.222470 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtdmq\" (UniqueName: \"kubernetes.io/projected/31e52dda-c821-42ea-a7a7-85d6f1c4dab1-kube-api-access-jtdmq\") pod \"31e52dda-c821-42ea-a7a7-85d6f1c4dab1\" (UID: \"31e52dda-c821-42ea-a7a7-85d6f1c4dab1\") " Mar 14 09:43:22 crc kubenswrapper[4956]: I0314 09:43:22.222859 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e52dda-c821-42ea-a7a7-85d6f1c4dab1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31e52dda-c821-42ea-a7a7-85d6f1c4dab1" (UID: "31e52dda-c821-42ea-a7a7-85d6f1c4dab1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:43:22 crc kubenswrapper[4956]: I0314 09:43:22.223951 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e52dda-c821-42ea-a7a7-85d6f1c4dab1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:22 crc kubenswrapper[4956]: I0314 09:43:22.227403 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e52dda-c821-42ea-a7a7-85d6f1c4dab1-kube-api-access-jtdmq" (OuterVolumeSpecName: "kube-api-access-jtdmq") pod "31e52dda-c821-42ea-a7a7-85d6f1c4dab1" (UID: "31e52dda-c821-42ea-a7a7-85d6f1c4dab1"). InnerVolumeSpecName "kube-api-access-jtdmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:43:22 crc kubenswrapper[4956]: I0314 09:43:22.325321 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtdmq\" (UniqueName: \"kubernetes.io/projected/31e52dda-c821-42ea-a7a7-85d6f1c4dab1-kube-api-access-jtdmq\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:22 crc kubenswrapper[4956]: E0314 09:43:22.325426 4956 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:43:22 crc kubenswrapper[4956]: E0314 09:43:22.325507 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-config-data podName:b4e5b2e4-bf54-404f-9a07-d13e9ec4f723 nodeName:}" failed. No retries permitted until 2026-03-14 09:43:26.325490081 +0000 UTC m=+2811.838182349 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "b4e5b2e4-bf54-404f-9a07-d13e9ec4f723") : secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:43:22 crc kubenswrapper[4956]: I0314 09:43:22.719428 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.254:3000/\": dial tcp 10.217.0.254:3000: connect: connection refused" Mar 14 09:43:22 crc kubenswrapper[4956]: I0314 09:43:22.743170 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher2dda-account-delete-j4rt2" event={"ID":"31e52dda-c821-42ea-a7a7-85d6f1c4dab1","Type":"ContainerDied","Data":"da07026eb1cfe9fdfcfbe7feeae6920d7cf69c0de6986740aab41f762501da55"} Mar 14 09:43:22 crc kubenswrapper[4956]: I0314 09:43:22.743231 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da07026eb1cfe9fdfcfbe7feeae6920d7cf69c0de6986740aab41f762501da55" Mar 14 09:43:22 crc kubenswrapper[4956]: I0314 09:43:22.743252 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher2dda-account-delete-j4rt2" Mar 14 09:43:23 crc kubenswrapper[4956]: I0314 09:43:23.565805 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-j8fn2"] Mar 14 09:43:23 crc kubenswrapper[4956]: I0314 09:43:23.573852 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-j8fn2"] Mar 14 09:43:23 crc kubenswrapper[4956]: I0314 09:43:23.581266 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher2dda-account-delete-j4rt2"] Mar 14 09:43:23 crc kubenswrapper[4956]: I0314 09:43:23.590379 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher2dda-account-delete-j4rt2"] Mar 14 09:43:23 crc kubenswrapper[4956]: I0314 09:43:23.599291 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb"] Mar 14 09:43:23 crc kubenswrapper[4956]: I0314 09:43:23.606610 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-2dda-account-create-update-mn8kb"] Mar 14 09:43:25 crc kubenswrapper[4956]: I0314 09:43:25.223584 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e52dda-c821-42ea-a7a7-85d6f1c4dab1" path="/var/lib/kubelet/pods/31e52dda-c821-42ea-a7a7-85d6f1c4dab1/volumes" Mar 14 09:43:25 crc kubenswrapper[4956]: I0314 09:43:25.224206 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65b2bab5-aa6d-4da2-a689-8908125bebff" path="/var/lib/kubelet/pods/65b2bab5-aa6d-4da2-a689-8908125bebff/volumes" Mar 14 09:43:25 crc kubenswrapper[4956]: I0314 09:43:25.224845 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d16fe9b4-f1f7-416c-a9c7-2996cddc6a29" path="/var/lib/kubelet/pods/d16fe9b4-f1f7-416c-a9c7-2996cddc6a29/volumes" Mar 14 09:43:26 crc kubenswrapper[4956]: E0314 09:43:26.385248 4956 secret.go:188] 
Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:43:26 crc kubenswrapper[4956]: E0314 09:43:26.385536 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-config-data podName:b4e5b2e4-bf54-404f-9a07-d13e9ec4f723 nodeName:}" failed. No retries permitted until 2026-03-14 09:43:34.385520986 +0000 UTC m=+2819.898213254 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "b4e5b2e4-bf54-404f-9a07-d13e9ec4f723") : secret "watcher-kuttl-decision-engine-config-data" not found Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.334627 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.531178 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-config-data\") pod \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.531228 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-combined-ca-bundle\") pod \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.531259 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6546\" (UniqueName: \"kubernetes.io/projected/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-kube-api-access-c6546\") pod 
\"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.531303 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-logs\") pod \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.531395 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-custom-prometheus-ca\") pod \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.531420 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-cert-memcached-mtls\") pod \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\" (UID: \"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723\") " Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.531978 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-logs" (OuterVolumeSpecName: "logs") pod "b4e5b2e4-bf54-404f-9a07-d13e9ec4f723" (UID: "b4e5b2e4-bf54-404f-9a07-d13e9ec4f723"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.541717 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-kube-api-access-c6546" (OuterVolumeSpecName: "kube-api-access-c6546") pod "b4e5b2e4-bf54-404f-9a07-d13e9ec4f723" (UID: "b4e5b2e4-bf54-404f-9a07-d13e9ec4f723"). InnerVolumeSpecName "kube-api-access-c6546". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.554378 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "b4e5b2e4-bf54-404f-9a07-d13e9ec4f723" (UID: "b4e5b2e4-bf54-404f-9a07-d13e9ec4f723"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.558760 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4e5b2e4-bf54-404f-9a07-d13e9ec4f723" (UID: "b4e5b2e4-bf54-404f-9a07-d13e9ec4f723"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.577589 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-config-data" (OuterVolumeSpecName: "config-data") pod "b4e5b2e4-bf54-404f-9a07-d13e9ec4f723" (UID: "b4e5b2e4-bf54-404f-9a07-d13e9ec4f723"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.593776 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "b4e5b2e4-bf54-404f-9a07-d13e9ec4f723" (UID: "b4e5b2e4-bf54-404f-9a07-d13e9ec4f723"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.632844 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.632873 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.632884 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.632893 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.632905 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6546\" (UniqueName: \"kubernetes.io/projected/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-kube-api-access-c6546\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.632914 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.802211 4956 generic.go:334] "Generic (PLEG): container finished" podID="b4e5b2e4-bf54-404f-9a07-d13e9ec4f723" containerID="0a92862e43cd5fba63471c5ce410462133b425726fb5c75e10b29766831f5ddd" exitCode=0 Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.802264 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723","Type":"ContainerDied","Data":"0a92862e43cd5fba63471c5ce410462133b425726fb5c75e10b29766831f5ddd"} Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.802299 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"b4e5b2e4-bf54-404f-9a07-d13e9ec4f723","Type":"ContainerDied","Data":"ea51d01128a40f60767c051cbc58b18b4190bedd1a08811c20a8b736cfbd38d7"} Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.802322 4956 scope.go:117] "RemoveContainer" containerID="0a92862e43cd5fba63471c5ce410462133b425726fb5c75e10b29766831f5ddd" Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.802396 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.823672 4956 scope.go:117] "RemoveContainer" containerID="0a92862e43cd5fba63471c5ce410462133b425726fb5c75e10b29766831f5ddd" Mar 14 09:43:29 crc kubenswrapper[4956]: E0314 09:43:29.824101 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a92862e43cd5fba63471c5ce410462133b425726fb5c75e10b29766831f5ddd\": container with ID starting with 0a92862e43cd5fba63471c5ce410462133b425726fb5c75e10b29766831f5ddd not found: ID does not exist" containerID="0a92862e43cd5fba63471c5ce410462133b425726fb5c75e10b29766831f5ddd" Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.824138 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a92862e43cd5fba63471c5ce410462133b425726fb5c75e10b29766831f5ddd"} err="failed to get container status \"0a92862e43cd5fba63471c5ce410462133b425726fb5c75e10b29766831f5ddd\": rpc error: code = NotFound desc = could not find 
container \"0a92862e43cd5fba63471c5ce410462133b425726fb5c75e10b29766831f5ddd\": container with ID starting with 0a92862e43cd5fba63471c5ce410462133b425726fb5c75e10b29766831f5ddd not found: ID does not exist" Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.836986 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:43:29 crc kubenswrapper[4956]: I0314 09:43:29.851537 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:43:30 crc kubenswrapper[4956]: I0314 09:43:30.950998 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-ngj29"] Mar 14 09:43:30 crc kubenswrapper[4956]: E0314 09:43:30.951904 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06eb83ec-f4cc-4b4d-a129-cb029c188810" containerName="watcher-applier" Mar 14 09:43:30 crc kubenswrapper[4956]: I0314 09:43:30.951923 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="06eb83ec-f4cc-4b4d-a129-cb029c188810" containerName="watcher-applier" Mar 14 09:43:30 crc kubenswrapper[4956]: E0314 09:43:30.951958 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e52dda-c821-42ea-a7a7-85d6f1c4dab1" containerName="mariadb-account-delete" Mar 14 09:43:30 crc kubenswrapper[4956]: I0314 09:43:30.951966 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e52dda-c821-42ea-a7a7-85d6f1c4dab1" containerName="mariadb-account-delete" Mar 14 09:43:30 crc kubenswrapper[4956]: E0314 09:43:30.951975 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b8cbae-06d6-4ce8-b3c0-ab05743ebb80" containerName="watcher-api" Mar 14 09:43:30 crc kubenswrapper[4956]: I0314 09:43:30.951983 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b8cbae-06d6-4ce8-b3c0-ab05743ebb80" containerName="watcher-api" Mar 14 09:43:30 crc kubenswrapper[4956]: E0314 09:43:30.951997 4956 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e5b2e4-bf54-404f-9a07-d13e9ec4f723" containerName="watcher-decision-engine" Mar 14 09:43:30 crc kubenswrapper[4956]: I0314 09:43:30.952046 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e5b2e4-bf54-404f-9a07-d13e9ec4f723" containerName="watcher-decision-engine" Mar 14 09:43:30 crc kubenswrapper[4956]: E0314 09:43:30.952072 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b8cbae-06d6-4ce8-b3c0-ab05743ebb80" containerName="watcher-kuttl-api-log" Mar 14 09:43:30 crc kubenswrapper[4956]: I0314 09:43:30.952080 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b8cbae-06d6-4ce8-b3c0-ab05743ebb80" containerName="watcher-kuttl-api-log" Mar 14 09:43:30 crc kubenswrapper[4956]: I0314 09:43:30.952267 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4e5b2e4-bf54-404f-9a07-d13e9ec4f723" containerName="watcher-decision-engine" Mar 14 09:43:30 crc kubenswrapper[4956]: I0314 09:43:30.952280 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="31e52dda-c821-42ea-a7a7-85d6f1c4dab1" containerName="mariadb-account-delete" Mar 14 09:43:30 crc kubenswrapper[4956]: I0314 09:43:30.952297 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b8cbae-06d6-4ce8-b3c0-ab05743ebb80" containerName="watcher-api" Mar 14 09:43:30 crc kubenswrapper[4956]: I0314 09:43:30.952306 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="06eb83ec-f4cc-4b4d-a129-cb029c188810" containerName="watcher-applier" Mar 14 09:43:30 crc kubenswrapper[4956]: I0314 09:43:30.952319 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b8cbae-06d6-4ce8-b3c0-ab05743ebb80" containerName="watcher-kuttl-api-log" Mar 14 09:43:30 crc kubenswrapper[4956]: I0314 09:43:30.953061 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-ngj29" Mar 14 09:43:30 crc kubenswrapper[4956]: I0314 09:43:30.960630 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-6a49-account-create-update-82rm9"] Mar 14 09:43:30 crc kubenswrapper[4956]: I0314 09:43:30.961698 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-6a49-account-create-update-82rm9" Mar 14 09:43:30 crc kubenswrapper[4956]: I0314 09:43:30.963256 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Mar 14 09:43:30 crc kubenswrapper[4956]: I0314 09:43:30.971975 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-ngj29"] Mar 14 09:43:30 crc kubenswrapper[4956]: I0314 09:43:30.992521 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-6a49-account-create-update-82rm9"] Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.053005 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d71f38a-1f62-4195-8b0f-33ea24cd04b4-operator-scripts\") pod \"watcher-6a49-account-create-update-82rm9\" (UID: \"1d71f38a-1f62-4195-8b0f-33ea24cd04b4\") " pod="watcher-kuttl-default/watcher-6a49-account-create-update-82rm9" Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.053045 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f685c4cb-19e7-4b54-a594-29001444b634-operator-scripts\") pod \"watcher-db-create-ngj29\" (UID: \"f685c4cb-19e7-4b54-a594-29001444b634\") " pod="watcher-kuttl-default/watcher-db-create-ngj29" Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.053104 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9gw7\" (UniqueName: \"kubernetes.io/projected/1d71f38a-1f62-4195-8b0f-33ea24cd04b4-kube-api-access-j9gw7\") pod \"watcher-6a49-account-create-update-82rm9\" (UID: \"1d71f38a-1f62-4195-8b0f-33ea24cd04b4\") " pod="watcher-kuttl-default/watcher-6a49-account-create-update-82rm9" Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.053219 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6db64\" (UniqueName: \"kubernetes.io/projected/f685c4cb-19e7-4b54-a594-29001444b634-kube-api-access-6db64\") pod \"watcher-db-create-ngj29\" (UID: \"f685c4cb-19e7-4b54-a594-29001444b634\") " pod="watcher-kuttl-default/watcher-db-create-ngj29" Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.154358 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6db64\" (UniqueName: \"kubernetes.io/projected/f685c4cb-19e7-4b54-a594-29001444b634-kube-api-access-6db64\") pod \"watcher-db-create-ngj29\" (UID: \"f685c4cb-19e7-4b54-a594-29001444b634\") " pod="watcher-kuttl-default/watcher-db-create-ngj29" Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.154426 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d71f38a-1f62-4195-8b0f-33ea24cd04b4-operator-scripts\") pod \"watcher-6a49-account-create-update-82rm9\" (UID: \"1d71f38a-1f62-4195-8b0f-33ea24cd04b4\") " pod="watcher-kuttl-default/watcher-6a49-account-create-update-82rm9" Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.154447 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f685c4cb-19e7-4b54-a594-29001444b634-operator-scripts\") pod \"watcher-db-create-ngj29\" (UID: \"f685c4cb-19e7-4b54-a594-29001444b634\") " 
pod="watcher-kuttl-default/watcher-db-create-ngj29" Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.154475 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9gw7\" (UniqueName: \"kubernetes.io/projected/1d71f38a-1f62-4195-8b0f-33ea24cd04b4-kube-api-access-j9gw7\") pod \"watcher-6a49-account-create-update-82rm9\" (UID: \"1d71f38a-1f62-4195-8b0f-33ea24cd04b4\") " pod="watcher-kuttl-default/watcher-6a49-account-create-update-82rm9" Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.155180 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d71f38a-1f62-4195-8b0f-33ea24cd04b4-operator-scripts\") pod \"watcher-6a49-account-create-update-82rm9\" (UID: \"1d71f38a-1f62-4195-8b0f-33ea24cd04b4\") " pod="watcher-kuttl-default/watcher-6a49-account-create-update-82rm9" Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.155192 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f685c4cb-19e7-4b54-a594-29001444b634-operator-scripts\") pod \"watcher-db-create-ngj29\" (UID: \"f685c4cb-19e7-4b54-a594-29001444b634\") " pod="watcher-kuttl-default/watcher-db-create-ngj29" Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.174034 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9gw7\" (UniqueName: \"kubernetes.io/projected/1d71f38a-1f62-4195-8b0f-33ea24cd04b4-kube-api-access-j9gw7\") pod \"watcher-6a49-account-create-update-82rm9\" (UID: \"1d71f38a-1f62-4195-8b0f-33ea24cd04b4\") " pod="watcher-kuttl-default/watcher-6a49-account-create-update-82rm9" Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.180003 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6db64\" (UniqueName: \"kubernetes.io/projected/f685c4cb-19e7-4b54-a594-29001444b634-kube-api-access-6db64\") pod 
\"watcher-db-create-ngj29\" (UID: \"f685c4cb-19e7-4b54-a594-29001444b634\") " pod="watcher-kuttl-default/watcher-db-create-ngj29" Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.218196 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4e5b2e4-bf54-404f-9a07-d13e9ec4f723" path="/var/lib/kubelet/pods/b4e5b2e4-bf54-404f-9a07-d13e9ec4f723/volumes" Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.274578 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-ngj29" Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.283024 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-6a49-account-create-update-82rm9" Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.756354 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-ngj29"] Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.827962 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-6a49-account-create-update-82rm9"] Mar 14 09:43:31 crc kubenswrapper[4956]: I0314 09:43:31.834403 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-ngj29" event={"ID":"f685c4cb-19e7-4b54-a594-29001444b634","Type":"ContainerStarted","Data":"36d49069e30501d5405d50468fb74a6f1968152aad10bf610dc2c73a08be1764"} Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.760961 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.784512 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-config-data\") pod \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.784578 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-ceilometer-tls-certs\") pod \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.785140 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "281db90f-5ce5-44ed-a0dd-a83d23bc5b50" (UID: "281db90f-5ce5-44ed-a0dd-a83d23bc5b50"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.785244 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-run-httpd\") pod \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.785283 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-sg-core-conf-yaml\") pod \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.785323 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-combined-ca-bundle\") pod \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.786145 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-scripts\") pod \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.786170 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-log-httpd\") pod \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.786193 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5sgf\" (UniqueName: 
\"kubernetes.io/projected/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-kube-api-access-c5sgf\") pod \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\" (UID: \"281db90f-5ce5-44ed-a0dd-a83d23bc5b50\") " Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.786631 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.787656 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "281db90f-5ce5-44ed-a0dd-a83d23bc5b50" (UID: "281db90f-5ce5-44ed-a0dd-a83d23bc5b50"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.789989 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-scripts" (OuterVolumeSpecName: "scripts") pod "281db90f-5ce5-44ed-a0dd-a83d23bc5b50" (UID: "281db90f-5ce5-44ed-a0dd-a83d23bc5b50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.790235 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-kube-api-access-c5sgf" (OuterVolumeSpecName: "kube-api-access-c5sgf") pod "281db90f-5ce5-44ed-a0dd-a83d23bc5b50" (UID: "281db90f-5ce5-44ed-a0dd-a83d23bc5b50"). InnerVolumeSpecName "kube-api-access-c5sgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.814255 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "281db90f-5ce5-44ed-a0dd-a83d23bc5b50" (UID: "281db90f-5ce5-44ed-a0dd-a83d23bc5b50"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.838335 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "281db90f-5ce5-44ed-a0dd-a83d23bc5b50" (UID: "281db90f-5ce5-44ed-a0dd-a83d23bc5b50"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.843430 4956 generic.go:334] "Generic (PLEG): container finished" podID="1d71f38a-1f62-4195-8b0f-33ea24cd04b4" containerID="f5ab6b2e2e353ab2544b81611534a1925b470fab2198f521f67b7d6423d4136f" exitCode=0 Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.843513 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-6a49-account-create-update-82rm9" event={"ID":"1d71f38a-1f62-4195-8b0f-33ea24cd04b4","Type":"ContainerDied","Data":"f5ab6b2e2e353ab2544b81611534a1925b470fab2198f521f67b7d6423d4136f"} Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.843551 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-6a49-account-create-update-82rm9" event={"ID":"1d71f38a-1f62-4195-8b0f-33ea24cd04b4","Type":"ContainerStarted","Data":"3292b0a88518a29b0f4d73451a0ed81d8d3647003cac3f36466bd2c522ccff3f"} Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.853136 4956 generic.go:334] "Generic (PLEG): container 
finished" podID="f685c4cb-19e7-4b54-a594-29001444b634" containerID="f59c119113a1d8209718cbfedd24007c7228978ce5151769e01394556d2d139c" exitCode=0 Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.853372 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-ngj29" event={"ID":"f685c4cb-19e7-4b54-a594-29001444b634","Type":"ContainerDied","Data":"f59c119113a1d8209718cbfedd24007c7228978ce5151769e01394556d2d139c"} Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.858123 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "281db90f-5ce5-44ed-a0dd-a83d23bc5b50" (UID: "281db90f-5ce5-44ed-a0dd-a83d23bc5b50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.858251 4956 generic.go:334] "Generic (PLEG): container finished" podID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerID="62cf6fdd55006c5e954e55b390c9d3d6a5e8a8d5fd2508b523003953fe4dfeee" exitCode=137 Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.858329 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"281db90f-5ce5-44ed-a0dd-a83d23bc5b50","Type":"ContainerDied","Data":"62cf6fdd55006c5e954e55b390c9d3d6a5e8a8d5fd2508b523003953fe4dfeee"} Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.858342 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.858367 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"281db90f-5ce5-44ed-a0dd-a83d23bc5b50","Type":"ContainerDied","Data":"cdb1eb88aa9d897dde185c3233240e5c6ca4780edc5ab9c9815c995ebe8e8598"} Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.858414 4956 scope.go:117] "RemoveContainer" containerID="478833326be177f61cb55a847227c95e9d3e3214eca78b1e442ebc3fd3677740" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.873019 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-config-data" (OuterVolumeSpecName: "config-data") pod "281db90f-5ce5-44ed-a0dd-a83d23bc5b50" (UID: "281db90f-5ce5-44ed-a0dd-a83d23bc5b50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.888052 4956 scope.go:117] "RemoveContainer" containerID="23622070ab174a2e93feac67b231e79553d7fa509e9e789984f9cb23be6b9b40" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.888536 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.888560 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.888571 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:32 crc kubenswrapper[4956]: 
I0314 09:43:32.888580 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.888590 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5sgf\" (UniqueName: \"kubernetes.io/projected/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-kube-api-access-c5sgf\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.888598 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.888606 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/281db90f-5ce5-44ed-a0dd-a83d23bc5b50-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.905665 4956 scope.go:117] "RemoveContainer" containerID="adb68344a7eb7c5349ee595266f9285bcaf506b06bae3d6e825ff606af62da3a" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.922647 4956 scope.go:117] "RemoveContainer" containerID="62cf6fdd55006c5e954e55b390c9d3d6a5e8a8d5fd2508b523003953fe4dfeee" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.940260 4956 scope.go:117] "RemoveContainer" containerID="478833326be177f61cb55a847227c95e9d3e3214eca78b1e442ebc3fd3677740" Mar 14 09:43:32 crc kubenswrapper[4956]: E0314 09:43:32.940675 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"478833326be177f61cb55a847227c95e9d3e3214eca78b1e442ebc3fd3677740\": container with ID starting with 478833326be177f61cb55a847227c95e9d3e3214eca78b1e442ebc3fd3677740 not found: ID does not exist" 
containerID="478833326be177f61cb55a847227c95e9d3e3214eca78b1e442ebc3fd3677740" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.940720 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"478833326be177f61cb55a847227c95e9d3e3214eca78b1e442ebc3fd3677740"} err="failed to get container status \"478833326be177f61cb55a847227c95e9d3e3214eca78b1e442ebc3fd3677740\": rpc error: code = NotFound desc = could not find container \"478833326be177f61cb55a847227c95e9d3e3214eca78b1e442ebc3fd3677740\": container with ID starting with 478833326be177f61cb55a847227c95e9d3e3214eca78b1e442ebc3fd3677740 not found: ID does not exist" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.940744 4956 scope.go:117] "RemoveContainer" containerID="23622070ab174a2e93feac67b231e79553d7fa509e9e789984f9cb23be6b9b40" Mar 14 09:43:32 crc kubenswrapper[4956]: E0314 09:43:32.941042 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23622070ab174a2e93feac67b231e79553d7fa509e9e789984f9cb23be6b9b40\": container with ID starting with 23622070ab174a2e93feac67b231e79553d7fa509e9e789984f9cb23be6b9b40 not found: ID does not exist" containerID="23622070ab174a2e93feac67b231e79553d7fa509e9e789984f9cb23be6b9b40" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.941070 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23622070ab174a2e93feac67b231e79553d7fa509e9e789984f9cb23be6b9b40"} err="failed to get container status \"23622070ab174a2e93feac67b231e79553d7fa509e9e789984f9cb23be6b9b40\": rpc error: code = NotFound desc = could not find container \"23622070ab174a2e93feac67b231e79553d7fa509e9e789984f9cb23be6b9b40\": container with ID starting with 23622070ab174a2e93feac67b231e79553d7fa509e9e789984f9cb23be6b9b40 not found: ID does not exist" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.941085 4956 scope.go:117] 
"RemoveContainer" containerID="adb68344a7eb7c5349ee595266f9285bcaf506b06bae3d6e825ff606af62da3a" Mar 14 09:43:32 crc kubenswrapper[4956]: E0314 09:43:32.941300 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adb68344a7eb7c5349ee595266f9285bcaf506b06bae3d6e825ff606af62da3a\": container with ID starting with adb68344a7eb7c5349ee595266f9285bcaf506b06bae3d6e825ff606af62da3a not found: ID does not exist" containerID="adb68344a7eb7c5349ee595266f9285bcaf506b06bae3d6e825ff606af62da3a" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.941331 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adb68344a7eb7c5349ee595266f9285bcaf506b06bae3d6e825ff606af62da3a"} err="failed to get container status \"adb68344a7eb7c5349ee595266f9285bcaf506b06bae3d6e825ff606af62da3a\": rpc error: code = NotFound desc = could not find container \"adb68344a7eb7c5349ee595266f9285bcaf506b06bae3d6e825ff606af62da3a\": container with ID starting with adb68344a7eb7c5349ee595266f9285bcaf506b06bae3d6e825ff606af62da3a not found: ID does not exist" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.941347 4956 scope.go:117] "RemoveContainer" containerID="62cf6fdd55006c5e954e55b390c9d3d6a5e8a8d5fd2508b523003953fe4dfeee" Mar 14 09:43:32 crc kubenswrapper[4956]: E0314 09:43:32.941677 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62cf6fdd55006c5e954e55b390c9d3d6a5e8a8d5fd2508b523003953fe4dfeee\": container with ID starting with 62cf6fdd55006c5e954e55b390c9d3d6a5e8a8d5fd2508b523003953fe4dfeee not found: ID does not exist" containerID="62cf6fdd55006c5e954e55b390c9d3d6a5e8a8d5fd2508b523003953fe4dfeee" Mar 14 09:43:32 crc kubenswrapper[4956]: I0314 09:43:32.941707 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"62cf6fdd55006c5e954e55b390c9d3d6a5e8a8d5fd2508b523003953fe4dfeee"} err="failed to get container status \"62cf6fdd55006c5e954e55b390c9d3d6a5e8a8d5fd2508b523003953fe4dfeee\": rpc error: code = NotFound desc = could not find container \"62cf6fdd55006c5e954e55b390c9d3d6a5e8a8d5fd2508b523003953fe4dfeee\": container with ID starting with 62cf6fdd55006c5e954e55b390c9d3d6a5e8a8d5fd2508b523003953fe4dfeee not found: ID does not exist" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.189824 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.200438 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.218288 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" path="/var/lib/kubelet/pods/281db90f-5ce5-44ed-a0dd-a83d23bc5b50/volumes" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.219008 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:43:33 crc kubenswrapper[4956]: E0314 09:43:33.219248 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerName="sg-core" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.219262 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerName="sg-core" Mar 14 09:43:33 crc kubenswrapper[4956]: E0314 09:43:33.219278 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerName="ceilometer-notification-agent" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.219284 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerName="ceilometer-notification-agent" Mar 
14 09:43:33 crc kubenswrapper[4956]: E0314 09:43:33.219311 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerName="ceilometer-central-agent" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.219318 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerName="ceilometer-central-agent" Mar 14 09:43:33 crc kubenswrapper[4956]: E0314 09:43:33.219327 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerName="proxy-httpd" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.219333 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerName="proxy-httpd" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.219464 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerName="ceilometer-central-agent" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.219478 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerName="ceilometer-notification-agent" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.219508 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerName="proxy-httpd" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.219519 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="281db90f-5ce5-44ed-a0dd-a83d23bc5b50" containerName="sg-core" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.222955 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.225241 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.225410 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.225771 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.226509 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.294128 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfff884b-f18d-4e79-8ad9-deb71e90aea5-run-httpd\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.294419 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.294526 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb7nq\" (UniqueName: \"kubernetes.io/projected/bfff884b-f18d-4e79-8ad9-deb71e90aea5-kube-api-access-pb7nq\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.294696 
4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-scripts\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.294824 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.295607 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.295900 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-config-data\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.296016 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfff884b-f18d-4e79-8ad9-deb71e90aea5-log-httpd\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.398046 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-scripts\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.398336 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.398876 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.398995 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-config-data\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.399148 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfff884b-f18d-4e79-8ad9-deb71e90aea5-log-httpd\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.399221 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfff884b-f18d-4e79-8ad9-deb71e90aea5-run-httpd\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " 
pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.399303 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.399372 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb7nq\" (UniqueName: \"kubernetes.io/projected/bfff884b-f18d-4e79-8ad9-deb71e90aea5-kube-api-access-pb7nq\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.399749 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfff884b-f18d-4e79-8ad9-deb71e90aea5-log-httpd\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.399839 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfff884b-f18d-4e79-8ad9-deb71e90aea5-run-httpd\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.403564 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.403624 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.404097 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-scripts\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.404660 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-config-data\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.405050 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.430629 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb7nq\" (UniqueName: \"kubernetes.io/projected/bfff884b-f18d-4e79-8ad9-deb71e90aea5-kube-api-access-pb7nq\") pod \"ceilometer-0\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:33 crc kubenswrapper[4956]: I0314 09:43:33.586135 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:34 crc kubenswrapper[4956]: W0314 09:43:34.036130 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfff884b_f18d_4e79_8ad9_deb71e90aea5.slice/crio-9a5e22fcad2856db3f5a9d7e0d397afc5467ce9bd0e8c68c8efb623d0d02a299 WatchSource:0}: Error finding container 9a5e22fcad2856db3f5a9d7e0d397afc5467ce9bd0e8c68c8efb623d0d02a299: Status 404 returned error can't find the container with id 9a5e22fcad2856db3f5a9d7e0d397afc5467ce9bd0e8c68c8efb623d0d02a299 Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.036170 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.166538 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-ngj29" Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.211878 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6db64\" (UniqueName: \"kubernetes.io/projected/f685c4cb-19e7-4b54-a594-29001444b634-kube-api-access-6db64\") pod \"f685c4cb-19e7-4b54-a594-29001444b634\" (UID: \"f685c4cb-19e7-4b54-a594-29001444b634\") " Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.211918 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f685c4cb-19e7-4b54-a594-29001444b634-operator-scripts\") pod \"f685c4cb-19e7-4b54-a594-29001444b634\" (UID: \"f685c4cb-19e7-4b54-a594-29001444b634\") " Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.212756 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f685c4cb-19e7-4b54-a594-29001444b634-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f685c4cb-19e7-4b54-a594-29001444b634" 
(UID: "f685c4cb-19e7-4b54-a594-29001444b634"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.216284 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f685c4cb-19e7-4b54-a594-29001444b634-kube-api-access-6db64" (OuterVolumeSpecName: "kube-api-access-6db64") pod "f685c4cb-19e7-4b54-a594-29001444b634" (UID: "f685c4cb-19e7-4b54-a594-29001444b634"). InnerVolumeSpecName "kube-api-access-6db64". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.313186 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6db64\" (UniqueName: \"kubernetes.io/projected/f685c4cb-19e7-4b54-a594-29001444b634-kube-api-access-6db64\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.313213 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f685c4cb-19e7-4b54-a594-29001444b634-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.315871 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-6a49-account-create-update-82rm9" Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.414782 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d71f38a-1f62-4195-8b0f-33ea24cd04b4-operator-scripts\") pod \"1d71f38a-1f62-4195-8b0f-33ea24cd04b4\" (UID: \"1d71f38a-1f62-4195-8b0f-33ea24cd04b4\") " Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.414884 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9gw7\" (UniqueName: \"kubernetes.io/projected/1d71f38a-1f62-4195-8b0f-33ea24cd04b4-kube-api-access-j9gw7\") pod \"1d71f38a-1f62-4195-8b0f-33ea24cd04b4\" (UID: \"1d71f38a-1f62-4195-8b0f-33ea24cd04b4\") " Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.415263 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d71f38a-1f62-4195-8b0f-33ea24cd04b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1d71f38a-1f62-4195-8b0f-33ea24cd04b4" (UID: "1d71f38a-1f62-4195-8b0f-33ea24cd04b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.415444 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d71f38a-1f62-4195-8b0f-33ea24cd04b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.425689 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d71f38a-1f62-4195-8b0f-33ea24cd04b4-kube-api-access-j9gw7" (OuterVolumeSpecName: "kube-api-access-j9gw7") pod "1d71f38a-1f62-4195-8b0f-33ea24cd04b4" (UID: "1d71f38a-1f62-4195-8b0f-33ea24cd04b4"). InnerVolumeSpecName "kube-api-access-j9gw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.516407 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9gw7\" (UniqueName: \"kubernetes.io/projected/1d71f38a-1f62-4195-8b0f-33ea24cd04b4-kube-api-access-j9gw7\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.876851 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-6a49-account-create-update-82rm9" Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.876853 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-6a49-account-create-update-82rm9" event={"ID":"1d71f38a-1f62-4195-8b0f-33ea24cd04b4","Type":"ContainerDied","Data":"3292b0a88518a29b0f4d73451a0ed81d8d3647003cac3f36466bd2c522ccff3f"} Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.877352 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3292b0a88518a29b0f4d73451a0ed81d8d3647003cac3f36466bd2c522ccff3f" Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.878215 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bfff884b-f18d-4e79-8ad9-deb71e90aea5","Type":"ContainerStarted","Data":"f4a426e293d7985c27707e89f217900a5ed10247f3ec50f7c9e7fa1369d9e2fa"} Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.878260 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bfff884b-f18d-4e79-8ad9-deb71e90aea5","Type":"ContainerStarted","Data":"9a5e22fcad2856db3f5a9d7e0d397afc5467ce9bd0e8c68c8efb623d0d02a299"} Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.879325 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-ngj29" 
event={"ID":"f685c4cb-19e7-4b54-a594-29001444b634","Type":"ContainerDied","Data":"36d49069e30501d5405d50468fb74a6f1968152aad10bf610dc2c73a08be1764"} Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.879355 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36d49069e30501d5405d50468fb74a6f1968152aad10bf610dc2c73a08be1764" Mar 14 09:43:34 crc kubenswrapper[4956]: I0314 09:43:34.879432 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-ngj29" Mar 14 09:43:35 crc kubenswrapper[4956]: I0314 09:43:35.890282 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bfff884b-f18d-4e79-8ad9-deb71e90aea5","Type":"ContainerStarted","Data":"f23a35059a3ed34d8ab3b7e94b9e68f119671352533fca93416d07a0f6c84c5c"} Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.204777 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-xrs27"] Mar 14 09:43:36 crc kubenswrapper[4956]: E0314 09:43:36.205116 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f685c4cb-19e7-4b54-a594-29001444b634" containerName="mariadb-database-create" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.205131 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f685c4cb-19e7-4b54-a594-29001444b634" containerName="mariadb-database-create" Mar 14 09:43:36 crc kubenswrapper[4956]: E0314 09:43:36.205147 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d71f38a-1f62-4195-8b0f-33ea24cd04b4" containerName="mariadb-account-create-update" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.205153 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d71f38a-1f62-4195-8b0f-33ea24cd04b4" containerName="mariadb-account-create-update" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.205294 4956 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f685c4cb-19e7-4b54-a594-29001444b634" containerName="mariadb-database-create" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.205312 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d71f38a-1f62-4195-8b0f-33ea24cd04b4" containerName="mariadb-account-create-update" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.205832 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.215070 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.215410 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-sdzfx" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.218897 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-xrs27"] Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.241989 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-xrs27\" (UID: \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.243112 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-db-sync-config-data\") pod \"watcher-kuttl-db-sync-xrs27\" (UID: \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.243384 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfmk6\" (UniqueName: \"kubernetes.io/projected/e6b2e2cb-91b3-498b-978c-18fb76d846b4-kube-api-access-wfmk6\") pod \"watcher-kuttl-db-sync-xrs27\" (UID: \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.243425 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-config-data\") pod \"watcher-kuttl-db-sync-xrs27\" (UID: \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.344341 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-db-sync-config-data\") pod \"watcher-kuttl-db-sync-xrs27\" (UID: \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.344393 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfmk6\" (UniqueName: \"kubernetes.io/projected/e6b2e2cb-91b3-498b-978c-18fb76d846b4-kube-api-access-wfmk6\") pod \"watcher-kuttl-db-sync-xrs27\" (UID: \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.344412 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-config-data\") pod \"watcher-kuttl-db-sync-xrs27\" (UID: \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" Mar 14 09:43:36 
crc kubenswrapper[4956]: I0314 09:43:36.344450 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-xrs27\" (UID: \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.350413 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-db-sync-config-data\") pod \"watcher-kuttl-db-sync-xrs27\" (UID: \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.350619 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-config-data\") pod \"watcher-kuttl-db-sync-xrs27\" (UID: \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.356218 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-xrs27\" (UID: \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.361037 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfmk6\" (UniqueName: \"kubernetes.io/projected/e6b2e2cb-91b3-498b-978c-18fb76d846b4-kube-api-access-wfmk6\") pod \"watcher-kuttl-db-sync-xrs27\" (UID: \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" Mar 14 09:43:36 crc 
kubenswrapper[4956]: I0314 09:43:36.553453 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" Mar 14 09:43:36 crc kubenswrapper[4956]: I0314 09:43:36.899661 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bfff884b-f18d-4e79-8ad9-deb71e90aea5","Type":"ContainerStarted","Data":"e314abb28bd7144d13f4500a8f1b370831aad2af94953c43d75d0d28198a88d4"} Mar 14 09:43:37 crc kubenswrapper[4956]: I0314 09:43:37.037659 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-xrs27"] Mar 14 09:43:37 crc kubenswrapper[4956]: W0314 09:43:37.039238 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6b2e2cb_91b3_498b_978c_18fb76d846b4.slice/crio-32d1670db4db908458c7aeb9c411a7348dd113f4fe01eabb3ebccc9f4debc919 WatchSource:0}: Error finding container 32d1670db4db908458c7aeb9c411a7348dd113f4fe01eabb3ebccc9f4debc919: Status 404 returned error can't find the container with id 32d1670db4db908458c7aeb9c411a7348dd113f4fe01eabb3ebccc9f4debc919 Mar 14 09:43:37 crc kubenswrapper[4956]: I0314 09:43:37.909463 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bfff884b-f18d-4e79-8ad9-deb71e90aea5","Type":"ContainerStarted","Data":"4a5cf4f73c2a05c300a4359fb092cac11e29d60d6761e3a3fc4119f1d62fb88f"} Mar 14 09:43:37 crc kubenswrapper[4956]: I0314 09:43:37.909873 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:37 crc kubenswrapper[4956]: I0314 09:43:37.910768 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" 
event={"ID":"e6b2e2cb-91b3-498b-978c-18fb76d846b4","Type":"ContainerStarted","Data":"48a5745b1aa44ba60cd3b7447229aa0b367cd12eac97674caff8f45d9c14a52a"} Mar 14 09:43:37 crc kubenswrapper[4956]: I0314 09:43:37.910805 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" event={"ID":"e6b2e2cb-91b3-498b-978c-18fb76d846b4","Type":"ContainerStarted","Data":"32d1670db4db908458c7aeb9c411a7348dd113f4fe01eabb3ebccc9f4debc919"} Mar 14 09:43:37 crc kubenswrapper[4956]: I0314 09:43:37.935014 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.3934204829999999 podStartE2EDuration="4.934996294s" podCreationTimestamp="2026-03-14 09:43:33 +0000 UTC" firstStartedPulling="2026-03-14 09:43:34.043132377 +0000 UTC m=+2819.555824645" lastFinishedPulling="2026-03-14 09:43:37.584708188 +0000 UTC m=+2823.097400456" observedRunningTime="2026-03-14 09:43:37.928406117 +0000 UTC m=+2823.441098385" watchObservedRunningTime="2026-03-14 09:43:37.934996294 +0000 UTC m=+2823.447688562" Mar 14 09:43:37 crc kubenswrapper[4956]: I0314 09:43:37.949295 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" podStartSLOduration=1.949275685 podStartE2EDuration="1.949275685s" podCreationTimestamp="2026-03-14 09:43:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:43:37.942030342 +0000 UTC m=+2823.454722610" watchObservedRunningTime="2026-03-14 09:43:37.949275685 +0000 UTC m=+2823.461967953" Mar 14 09:43:39 crc kubenswrapper[4956]: I0314 09:43:39.926747 4956 generic.go:334] "Generic (PLEG): container finished" podID="e6b2e2cb-91b3-498b-978c-18fb76d846b4" containerID="48a5745b1aa44ba60cd3b7447229aa0b367cd12eac97674caff8f45d9c14a52a" exitCode=0 Mar 14 09:43:39 crc kubenswrapper[4956]: I0314 
09:43:39.926837 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" event={"ID":"e6b2e2cb-91b3-498b-978c-18fb76d846b4","Type":"ContainerDied","Data":"48a5745b1aa44ba60cd3b7447229aa0b367cd12eac97674caff8f45d9c14a52a"} Mar 14 09:43:41 crc kubenswrapper[4956]: I0314 09:43:41.274224 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" Mar 14 09:43:41 crc kubenswrapper[4956]: I0314 09:43:41.371310 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfmk6\" (UniqueName: \"kubernetes.io/projected/e6b2e2cb-91b3-498b-978c-18fb76d846b4-kube-api-access-wfmk6\") pod \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\" (UID: \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\") " Mar 14 09:43:41 crc kubenswrapper[4956]: I0314 09:43:41.371402 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-combined-ca-bundle\") pod \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\" (UID: \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\") " Mar 14 09:43:41 crc kubenswrapper[4956]: I0314 09:43:41.371499 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-db-sync-config-data\") pod \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\" (UID: \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\") " Mar 14 09:43:41 crc kubenswrapper[4956]: I0314 09:43:41.371524 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-config-data\") pod \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\" (UID: \"e6b2e2cb-91b3-498b-978c-18fb76d846b4\") " Mar 14 09:43:41 crc kubenswrapper[4956]: I0314 09:43:41.383375 4956 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e6b2e2cb-91b3-498b-978c-18fb76d846b4" (UID: "e6b2e2cb-91b3-498b-978c-18fb76d846b4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:41 crc kubenswrapper[4956]: I0314 09:43:41.383378 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b2e2cb-91b3-498b-978c-18fb76d846b4-kube-api-access-wfmk6" (OuterVolumeSpecName: "kube-api-access-wfmk6") pod "e6b2e2cb-91b3-498b-978c-18fb76d846b4" (UID: "e6b2e2cb-91b3-498b-978c-18fb76d846b4"). InnerVolumeSpecName "kube-api-access-wfmk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:43:41 crc kubenswrapper[4956]: I0314 09:43:41.397414 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6b2e2cb-91b3-498b-978c-18fb76d846b4" (UID: "e6b2e2cb-91b3-498b-978c-18fb76d846b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:41 crc kubenswrapper[4956]: I0314 09:43:41.410975 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-config-data" (OuterVolumeSpecName: "config-data") pod "e6b2e2cb-91b3-498b-978c-18fb76d846b4" (UID: "e6b2e2cb-91b3-498b-978c-18fb76d846b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:41 crc kubenswrapper[4956]: I0314 09:43:41.474951 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfmk6\" (UniqueName: \"kubernetes.io/projected/e6b2e2cb-91b3-498b-978c-18fb76d846b4-kube-api-access-wfmk6\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:41 crc kubenswrapper[4956]: I0314 09:43:41.474985 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:41 crc kubenswrapper[4956]: I0314 09:43:41.474997 4956 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:41 crc kubenswrapper[4956]: I0314 09:43:41.475008 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b2e2cb-91b3-498b-978c-18fb76d846b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:41 crc kubenswrapper[4956]: I0314 09:43:41.949910 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" event={"ID":"e6b2e2cb-91b3-498b-978c-18fb76d846b4","Type":"ContainerDied","Data":"32d1670db4db908458c7aeb9c411a7348dd113f4fe01eabb3ebccc9f4debc919"} Mar 14 09:43:41 crc kubenswrapper[4956]: I0314 09:43:41.949951 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32d1670db4db908458c7aeb9c411a7348dd113f4fe01eabb3ebccc9f4debc919" Mar 14 09:43:41 crc kubenswrapper[4956]: I0314 09:43:41.950005 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-xrs27" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.558585 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:43:42 crc kubenswrapper[4956]: E0314 09:43:42.558884 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b2e2cb-91b3-498b-978c-18fb76d846b4" containerName="watcher-kuttl-db-sync" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.558896 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b2e2cb-91b3-498b-978c-18fb76d846b4" containerName="watcher-kuttl-db-sync" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.559053 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b2e2cb-91b3-498b-978c-18fb76d846b4" containerName="watcher-kuttl-db-sync" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.560069 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.562917 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-sdzfx" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.563120 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.573142 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.574388 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.587439 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.594038 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.663218 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.664628 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.668255 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.682824 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.684916 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.686640 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.693378 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.693428 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.693587 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl4j4\" (UniqueName: \"kubernetes.io/projected/ff8672e3-e8f7-4738-9925-9945e4db581e-kube-api-access-bl4j4\") pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.693631 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.693686 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.693729 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8672e3-e8f7-4738-9925-9945e4db581e-logs\") pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.693756 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-logs\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.693782 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.693815 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.693845 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.693876 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4497\" (UniqueName: \"kubernetes.io/projected/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-kube-api-access-w4497\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.693908 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.697823 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.715777 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.795175 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.795263 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.795899 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk9rd\" (UniqueName: \"kubernetes.io/projected/2decb267-0305-4e3e-b505-cc929640d7a8-kube-api-access-jk9rd\") pod \"watcher-kuttl-applier-0\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.795974 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl4j4\" (UniqueName: \"kubernetes.io/projected/ff8672e3-e8f7-4738-9925-9945e4db581e-kube-api-access-bl4j4\") pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.796112 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.796316 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92102f18-f0db-441b-a147-a8fab678a887-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc 
kubenswrapper[4956]: I0314 09:43:42.796393 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.796460 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w7d6\" (UniqueName: \"kubernetes.io/projected/92102f18-f0db-441b-a147-a8fab678a887-kube-api-access-7w7d6\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.796515 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2decb267-0305-4e3e-b505-cc929640d7a8-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.796569 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8672e3-e8f7-4738-9925-9945e4db581e-logs\") pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.796618 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-logs\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.796660 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.796689 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.796731 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.796767 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.796812 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4497\" (UniqueName: \"kubernetes.io/projected/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-kube-api-access-w4497\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.796848 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.796883 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.796905 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.796933 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.796975 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 
09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.796998 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.797023 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.804361 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.805597 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8672e3-e8f7-4738-9925-9945e4db581e-logs\") pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.805966 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-logs\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.817853 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.819100 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.819178 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.819275 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.819805 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.820509 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-config-data\") 
pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.822229 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.825774 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4497\" (UniqueName: \"kubernetes.io/projected/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-kube-api-access-w4497\") pod \"watcher-kuttl-api-1\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.827172 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl4j4\" (UniqueName: \"kubernetes.io/projected/ff8672e3-e8f7-4738-9925-9945e4db581e-kube-api-access-bl4j4\") pod \"watcher-kuttl-api-0\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.885240 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.892436 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.898460 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.898572 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.898599 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.899109 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.899157 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: 
\"92102f18-f0db-441b-a147-a8fab678a887\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.899220 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.899258 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk9rd\" (UniqueName: \"kubernetes.io/projected/2decb267-0305-4e3e-b505-cc929640d7a8-kube-api-access-jk9rd\") pod \"watcher-kuttl-applier-0\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.899344 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92102f18-f0db-441b-a147-a8fab678a887-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.899813 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w7d6\" (UniqueName: \"kubernetes.io/projected/92102f18-f0db-441b-a147-a8fab678a887-kube-api-access-7w7d6\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.899844 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2decb267-0305-4e3e-b505-cc929640d7a8-logs\") pod \"watcher-kuttl-applier-0\" (UID: 
\"2decb267-0305-4e3e-b505-cc929640d7a8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.899891 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.900139 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92102f18-f0db-441b-a147-a8fab678a887-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.900885 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2decb267-0305-4e3e-b505-cc929640d7a8-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.902612 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.903317 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.904104 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.905203 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.909553 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.910060 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.911077 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.922295 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w7d6\" (UniqueName: \"kubernetes.io/projected/92102f18-f0db-441b-a147-a8fab678a887-kube-api-access-7w7d6\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.924092 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk9rd\" (UniqueName: \"kubernetes.io/projected/2decb267-0305-4e3e-b505-cc929640d7a8-kube-api-access-jk9rd\") pod \"watcher-kuttl-applier-0\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:42 crc kubenswrapper[4956]: I0314 09:43:42.990995 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.005263 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.372852 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.463280 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.535067 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.541419 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:43:43 crc kubenswrapper[4956]: W0314 09:43:43.546257 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92102f18_f0db_441b_a147_a8fab678a887.slice/crio-eb6ed7a3472ac5ca6854d9bc1bbe1f0fb6a8ef1bf7613d00c9449503650056b8 WatchSource:0}: Error finding container eb6ed7a3472ac5ca6854d9bc1bbe1f0fb6a8ef1bf7613d00c9449503650056b8: Status 404 returned error can't find the container with id eb6ed7a3472ac5ca6854d9bc1bbe1f0fb6a8ef1bf7613d00c9449503650056b8 Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.972150 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"ff8672e3-e8f7-4738-9925-9945e4db581e","Type":"ContainerStarted","Data":"4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044"} Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.972530 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.972555 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"ff8672e3-e8f7-4738-9925-9945e4db581e","Type":"ContainerStarted","Data":"2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18"} Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.972568 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"ff8672e3-e8f7-4738-9925-9945e4db581e","Type":"ContainerStarted","Data":"21b654a5e616109b65d50b0cad1cf5ab15d41040e9fc629a7c02d510ca0040d1"} Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.974585 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="ff8672e3-e8f7-4738-9925-9945e4db581e" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.1.6:9322/\": dial tcp 10.217.1.6:9322: connect: connection refused" Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.975487 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"92102f18-f0db-441b-a147-a8fab678a887","Type":"ContainerStarted","Data":"20afb1dc8be257cc26f7540c416a1f2b88b65c1008898fd90b47626350dde528"} Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.975519 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"92102f18-f0db-441b-a147-a8fab678a887","Type":"ContainerStarted","Data":"eb6ed7a3472ac5ca6854d9bc1bbe1f0fb6a8ef1bf7613d00c9449503650056b8"} Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.977508 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"b5f047b6-f285-411b-9dcc-1883a2bbb1b0","Type":"ContainerStarted","Data":"d580e025fe4dcfb7cf2985d7c067431e022c13c00a5a572b4d9d5422ef81ddd9"} Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.977545 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" 
event={"ID":"b5f047b6-f285-411b-9dcc-1883a2bbb1b0","Type":"ContainerStarted","Data":"1d88463826e50ac0b1808e7ae6e5ee0c36fc26eed22a660e8a0923089f81ecfa"} Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.977559 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"b5f047b6-f285-411b-9dcc-1883a2bbb1b0","Type":"ContainerStarted","Data":"c1e704b69da4ed696efe2f2c12e41263c5f239d6debb0238a172f77b1cffc6e0"} Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.977687 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.979329 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="b5f047b6-f285-411b-9dcc-1883a2bbb1b0" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.1.7:9322/\": dial tcp 10.217.1.7:9322: connect: connection refused" Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.979743 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"2decb267-0305-4e3e-b505-cc929640d7a8","Type":"ContainerStarted","Data":"ed722effbccaaad2fa2c1831f9438c159b8fb25f2e5da4988970b64f58197ea3"} Mar 14 09:43:43 crc kubenswrapper[4956]: I0314 09:43:43.979782 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"2decb267-0305-4e3e-b505-cc929640d7a8","Type":"ContainerStarted","Data":"8ad938bfe9b705cd10a49e38a8207ac74791c502ea8e227ab81ee69ab22162c7"} Mar 14 09:43:44 crc kubenswrapper[4956]: I0314 09:43:44.034312 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.034291432 podStartE2EDuration="2.034291432s" podCreationTimestamp="2026-03-14 09:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:43:44.027334796 +0000 UTC m=+2829.540027074" watchObservedRunningTime="2026-03-14 09:43:44.034291432 +0000 UTC m=+2829.546983700" Mar 14 09:43:44 crc kubenswrapper[4956]: I0314 09:43:44.077452 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.077435054 podStartE2EDuration="2.077435054s" podCreationTimestamp="2026-03-14 09:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:43:44.072335245 +0000 UTC m=+2829.585027513" watchObservedRunningTime="2026-03-14 09:43:44.077435054 +0000 UTC m=+2829.590127322" Mar 14 09:43:44 crc kubenswrapper[4956]: I0314 09:43:44.102965 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-1" podStartSLOduration=2.10294792 podStartE2EDuration="2.10294792s" podCreationTimestamp="2026-03-14 09:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:43:44.093371527 +0000 UTC m=+2829.606063805" watchObservedRunningTime="2026-03-14 09:43:44.10294792 +0000 UTC m=+2829.615640188" Mar 14 09:43:44 crc kubenswrapper[4956]: I0314 09:43:44.119136 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.119119119 podStartE2EDuration="2.119119119s" podCreationTimestamp="2026-03-14 09:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:43:44.116125793 +0000 UTC m=+2829.628818071" watchObservedRunningTime="2026-03-14 09:43:44.119119119 +0000 UTC m=+2829.631811387" Mar 14 09:43:47 crc 
kubenswrapper[4956]: I0314 09:43:47.198137 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:47 crc kubenswrapper[4956]: I0314 09:43:47.277211 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:47 crc kubenswrapper[4956]: I0314 09:43:47.885964 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:47 crc kubenswrapper[4956]: I0314 09:43:47.893236 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:48 crc kubenswrapper[4956]: I0314 09:43:48.005950 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:52 crc kubenswrapper[4956]: I0314 09:43:52.886564 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:52 crc kubenswrapper[4956]: I0314 09:43:52.891429 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:52 crc kubenswrapper[4956]: I0314 09:43:52.893534 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:52 crc kubenswrapper[4956]: I0314 09:43:52.898039 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:52 crc kubenswrapper[4956]: I0314 09:43:52.991298 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:53 crc kubenswrapper[4956]: I0314 09:43:53.006865 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:53 crc kubenswrapper[4956]: I0314 09:43:53.018499 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:53 crc kubenswrapper[4956]: I0314 09:43:53.029413 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:53 crc kubenswrapper[4956]: I0314 09:43:53.052029 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:53 crc kubenswrapper[4956]: I0314 09:43:53.062880 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:43:53 crc kubenswrapper[4956]: I0314 09:43:53.062947 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1" Mar 14 09:43:53 crc kubenswrapper[4956]: I0314 09:43:53.078051 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:43:53 crc kubenswrapper[4956]: I0314 09:43:53.080033 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:43:54 crc kubenswrapper[4956]: I0314 09:43:54.534665 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:43:54 crc kubenswrapper[4956]: I0314 09:43:54.535557 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerName="ceilometer-central-agent" containerID="cri-o://f4a426e293d7985c27707e89f217900a5ed10247f3ec50f7c9e7fa1369d9e2fa" gracePeriod=30 Mar 14 09:43:54 crc kubenswrapper[4956]: I0314 09:43:54.535666 4956 
kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerName="sg-core" containerID="cri-o://e314abb28bd7144d13f4500a8f1b370831aad2af94953c43d75d0d28198a88d4" gracePeriod=30 Mar 14 09:43:54 crc kubenswrapper[4956]: I0314 09:43:54.535823 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerName="proxy-httpd" containerID="cri-o://4a5cf4f73c2a05c300a4359fb092cac11e29d60d6761e3a3fc4119f1d62fb88f" gracePeriod=30 Mar 14 09:43:54 crc kubenswrapper[4956]: I0314 09:43:54.535888 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerName="ceilometer-notification-agent" containerID="cri-o://f23a35059a3ed34d8ab3b7e94b9e68f119671352533fca93416d07a0f6c84c5c" gracePeriod=30 Mar 14 09:43:54 crc kubenswrapper[4956]: I0314 09:43:54.562639 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:55 crc kubenswrapper[4956]: I0314 09:43:55.071703 4956 generic.go:334] "Generic (PLEG): container finished" podID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerID="4a5cf4f73c2a05c300a4359fb092cac11e29d60d6761e3a3fc4119f1d62fb88f" exitCode=0 Mar 14 09:43:55 crc kubenswrapper[4956]: I0314 09:43:55.071738 4956 generic.go:334] "Generic (PLEG): container finished" podID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerID="e314abb28bd7144d13f4500a8f1b370831aad2af94953c43d75d0d28198a88d4" exitCode=2 Mar 14 09:43:55 crc kubenswrapper[4956]: I0314 09:43:55.071749 4956 generic.go:334] "Generic (PLEG): container finished" podID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerID="f4a426e293d7985c27707e89f217900a5ed10247f3ec50f7c9e7fa1369d9e2fa" exitCode=0 Mar 14 09:43:55 crc kubenswrapper[4956]: 
I0314 09:43:55.072803 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bfff884b-f18d-4e79-8ad9-deb71e90aea5","Type":"ContainerDied","Data":"4a5cf4f73c2a05c300a4359fb092cac11e29d60d6761e3a3fc4119f1d62fb88f"} Mar 14 09:43:55 crc kubenswrapper[4956]: I0314 09:43:55.073088 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bfff884b-f18d-4e79-8ad9-deb71e90aea5","Type":"ContainerDied","Data":"e314abb28bd7144d13f4500a8f1b370831aad2af94953c43d75d0d28198a88d4"} Mar 14 09:43:55 crc kubenswrapper[4956]: I0314 09:43:55.073103 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bfff884b-f18d-4e79-8ad9-deb71e90aea5","Type":"ContainerDied","Data":"f4a426e293d7985c27707e89f217900a5ed10247f3ec50f7c9e7fa1369d9e2fa"} Mar 14 09:43:56 crc kubenswrapper[4956]: I0314 09:43:56.808445 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:56 crc kubenswrapper[4956]: I0314 09:43:56.944312 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-config-data\") pod \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " Mar 14 09:43:56 crc kubenswrapper[4956]: I0314 09:43:56.944385 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb7nq\" (UniqueName: \"kubernetes.io/projected/bfff884b-f18d-4e79-8ad9-deb71e90aea5-kube-api-access-pb7nq\") pod \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " Mar 14 09:43:56 crc kubenswrapper[4956]: I0314 09:43:56.944446 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfff884b-f18d-4e79-8ad9-deb71e90aea5-run-httpd\") pod \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " Mar 14 09:43:56 crc kubenswrapper[4956]: I0314 09:43:56.944469 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-sg-core-conf-yaml\") pod \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " Mar 14 09:43:56 crc kubenswrapper[4956]: I0314 09:43:56.944532 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-ceilometer-tls-certs\") pod \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " Mar 14 09:43:56 crc kubenswrapper[4956]: I0314 09:43:56.944570 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfff884b-f18d-4e79-8ad9-deb71e90aea5-log-httpd\") pod \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " Mar 14 09:43:56 crc kubenswrapper[4956]: I0314 09:43:56.944637 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-scripts\") pod \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " Mar 14 09:43:56 crc kubenswrapper[4956]: I0314 09:43:56.944717 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-combined-ca-bundle\") pod \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\" (UID: \"bfff884b-f18d-4e79-8ad9-deb71e90aea5\") " Mar 14 09:43:56 crc kubenswrapper[4956]: I0314 09:43:56.946272 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfff884b-f18d-4e79-8ad9-deb71e90aea5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bfff884b-f18d-4e79-8ad9-deb71e90aea5" (UID: "bfff884b-f18d-4e79-8ad9-deb71e90aea5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:43:56 crc kubenswrapper[4956]: I0314 09:43:56.946792 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfff884b-f18d-4e79-8ad9-deb71e90aea5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bfff884b-f18d-4e79-8ad9-deb71e90aea5" (UID: "bfff884b-f18d-4e79-8ad9-deb71e90aea5"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:43:56 crc kubenswrapper[4956]: I0314 09:43:56.952043 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-scripts" (OuterVolumeSpecName: "scripts") pod "bfff884b-f18d-4e79-8ad9-deb71e90aea5" (UID: "bfff884b-f18d-4e79-8ad9-deb71e90aea5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:56 crc kubenswrapper[4956]: I0314 09:43:56.952353 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfff884b-f18d-4e79-8ad9-deb71e90aea5-kube-api-access-pb7nq" (OuterVolumeSpecName: "kube-api-access-pb7nq") pod "bfff884b-f18d-4e79-8ad9-deb71e90aea5" (UID: "bfff884b-f18d-4e79-8ad9-deb71e90aea5"). InnerVolumeSpecName "kube-api-access-pb7nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:43:56 crc kubenswrapper[4956]: I0314 09:43:56.987628 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bfff884b-f18d-4e79-8ad9-deb71e90aea5" (UID: "bfff884b-f18d-4e79-8ad9-deb71e90aea5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.002115 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bfff884b-f18d-4e79-8ad9-deb71e90aea5" (UID: "bfff884b-f18d-4e79-8ad9-deb71e90aea5"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.038527 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfff884b-f18d-4e79-8ad9-deb71e90aea5" (UID: "bfff884b-f18d-4e79-8ad9-deb71e90aea5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.046734 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.046780 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb7nq\" (UniqueName: \"kubernetes.io/projected/bfff884b-f18d-4e79-8ad9-deb71e90aea5-kube-api-access-pb7nq\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.046796 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfff884b-f18d-4e79-8ad9-deb71e90aea5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.046809 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.046820 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.046829 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/bfff884b-f18d-4e79-8ad9-deb71e90aea5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.046838 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.065051 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-config-data" (OuterVolumeSpecName: "config-data") pod "bfff884b-f18d-4e79-8ad9-deb71e90aea5" (UID: "bfff884b-f18d-4e79-8ad9-deb71e90aea5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.089155 4956 generic.go:334] "Generic (PLEG): container finished" podID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerID="f23a35059a3ed34d8ab3b7e94b9e68f119671352533fca93416d07a0f6c84c5c" exitCode=0 Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.089196 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bfff884b-f18d-4e79-8ad9-deb71e90aea5","Type":"ContainerDied","Data":"f23a35059a3ed34d8ab3b7e94b9e68f119671352533fca93416d07a0f6c84c5c"} Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.089243 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"bfff884b-f18d-4e79-8ad9-deb71e90aea5","Type":"ContainerDied","Data":"9a5e22fcad2856db3f5a9d7e0d397afc5467ce9bd0e8c68c8efb623d0d02a299"} Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.089278 4956 scope.go:117] "RemoveContainer" containerID="4a5cf4f73c2a05c300a4359fb092cac11e29d60d6761e3a3fc4119f1d62fb88f" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.089410 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.107269 4956 scope.go:117] "RemoveContainer" containerID="e314abb28bd7144d13f4500a8f1b370831aad2af94953c43d75d0d28198a88d4" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.126613 4956 scope.go:117] "RemoveContainer" containerID="f23a35059a3ed34d8ab3b7e94b9e68f119671352533fca93416d07a0f6c84c5c" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.130892 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.146714 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.148896 4956 scope.go:117] "RemoveContainer" containerID="f4a426e293d7985c27707e89f217900a5ed10247f3ec50f7c9e7fa1369d9e2fa" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.150285 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfff884b-f18d-4e79-8ad9-deb71e90aea5-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.161882 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:43:57 crc kubenswrapper[4956]: E0314 09:43:57.162233 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerName="ceilometer-notification-agent" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.162249 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerName="ceilometer-notification-agent" Mar 14 09:43:57 crc kubenswrapper[4956]: E0314 09:43:57.162266 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerName="proxy-httpd" Mar 14 09:43:57 crc 
kubenswrapper[4956]: I0314 09:43:57.162273 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerName="proxy-httpd" Mar 14 09:43:57 crc kubenswrapper[4956]: E0314 09:43:57.162291 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerName="sg-core" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.162303 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerName="sg-core" Mar 14 09:43:57 crc kubenswrapper[4956]: E0314 09:43:57.162314 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerName="ceilometer-central-agent" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.162321 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerName="ceilometer-central-agent" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.162507 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerName="sg-core" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.162522 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerName="ceilometer-notification-agent" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.162530 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerName="proxy-httpd" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.162540 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" containerName="ceilometer-central-agent" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.164010 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.166988 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.167002 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.167025 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.168595 4956 scope.go:117] "RemoveContainer" containerID="4a5cf4f73c2a05c300a4359fb092cac11e29d60d6761e3a3fc4119f1d62fb88f" Mar 14 09:43:57 crc kubenswrapper[4956]: E0314 09:43:57.169047 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a5cf4f73c2a05c300a4359fb092cac11e29d60d6761e3a3fc4119f1d62fb88f\": container with ID starting with 4a5cf4f73c2a05c300a4359fb092cac11e29d60d6761e3a3fc4119f1d62fb88f not found: ID does not exist" containerID="4a5cf4f73c2a05c300a4359fb092cac11e29d60d6761e3a3fc4119f1d62fb88f" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.169095 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a5cf4f73c2a05c300a4359fb092cac11e29d60d6761e3a3fc4119f1d62fb88f"} err="failed to get container status \"4a5cf4f73c2a05c300a4359fb092cac11e29d60d6761e3a3fc4119f1d62fb88f\": rpc error: code = NotFound desc = could not find container \"4a5cf4f73c2a05c300a4359fb092cac11e29d60d6761e3a3fc4119f1d62fb88f\": container with ID starting with 4a5cf4f73c2a05c300a4359fb092cac11e29d60d6761e3a3fc4119f1d62fb88f not found: ID does not exist" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.169129 4956 scope.go:117] "RemoveContainer" 
containerID="e314abb28bd7144d13f4500a8f1b370831aad2af94953c43d75d0d28198a88d4" Mar 14 09:43:57 crc kubenswrapper[4956]: E0314 09:43:57.169523 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e314abb28bd7144d13f4500a8f1b370831aad2af94953c43d75d0d28198a88d4\": container with ID starting with e314abb28bd7144d13f4500a8f1b370831aad2af94953c43d75d0d28198a88d4 not found: ID does not exist" containerID="e314abb28bd7144d13f4500a8f1b370831aad2af94953c43d75d0d28198a88d4" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.169549 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e314abb28bd7144d13f4500a8f1b370831aad2af94953c43d75d0d28198a88d4"} err="failed to get container status \"e314abb28bd7144d13f4500a8f1b370831aad2af94953c43d75d0d28198a88d4\": rpc error: code = NotFound desc = could not find container \"e314abb28bd7144d13f4500a8f1b370831aad2af94953c43d75d0d28198a88d4\": container with ID starting with e314abb28bd7144d13f4500a8f1b370831aad2af94953c43d75d0d28198a88d4 not found: ID does not exist" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.169568 4956 scope.go:117] "RemoveContainer" containerID="f23a35059a3ed34d8ab3b7e94b9e68f119671352533fca93416d07a0f6c84c5c" Mar 14 09:43:57 crc kubenswrapper[4956]: E0314 09:43:57.169974 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f23a35059a3ed34d8ab3b7e94b9e68f119671352533fca93416d07a0f6c84c5c\": container with ID starting with f23a35059a3ed34d8ab3b7e94b9e68f119671352533fca93416d07a0f6c84c5c not found: ID does not exist" containerID="f23a35059a3ed34d8ab3b7e94b9e68f119671352533fca93416d07a0f6c84c5c" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.169993 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f23a35059a3ed34d8ab3b7e94b9e68f119671352533fca93416d07a0f6c84c5c"} err="failed to get container status \"f23a35059a3ed34d8ab3b7e94b9e68f119671352533fca93416d07a0f6c84c5c\": rpc error: code = NotFound desc = could not find container \"f23a35059a3ed34d8ab3b7e94b9e68f119671352533fca93416d07a0f6c84c5c\": container with ID starting with f23a35059a3ed34d8ab3b7e94b9e68f119671352533fca93416d07a0f6c84c5c not found: ID does not exist" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.170006 4956 scope.go:117] "RemoveContainer" containerID="f4a426e293d7985c27707e89f217900a5ed10247f3ec50f7c9e7fa1369d9e2fa" Mar 14 09:43:57 crc kubenswrapper[4956]: E0314 09:43:57.170265 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4a426e293d7985c27707e89f217900a5ed10247f3ec50f7c9e7fa1369d9e2fa\": container with ID starting with f4a426e293d7985c27707e89f217900a5ed10247f3ec50f7c9e7fa1369d9e2fa not found: ID does not exist" containerID="f4a426e293d7985c27707e89f217900a5ed10247f3ec50f7c9e7fa1369d9e2fa" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.170304 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4a426e293d7985c27707e89f217900a5ed10247f3ec50f7c9e7fa1369d9e2fa"} err="failed to get container status \"f4a426e293d7985c27707e89f217900a5ed10247f3ec50f7c9e7fa1369d9e2fa\": rpc error: code = NotFound desc = could not find container \"f4a426e293d7985c27707e89f217900a5ed10247f3ec50f7c9e7fa1369d9e2fa\": container with ID starting with f4a426e293d7985c27707e89f217900a5ed10247f3ec50f7c9e7fa1369d9e2fa not found: ID does not exist" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.173546 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.219622 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bfff884b-f18d-4e79-8ad9-deb71e90aea5" path="/var/lib/kubelet/pods/bfff884b-f18d-4e79-8ad9-deb71e90aea5/volumes" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.251903 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9k4r\" (UniqueName: \"kubernetes.io/projected/7dc3d756-2060-4570-a9e8-693e99a932b9-kube-api-access-s9k4r\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.251983 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dc3d756-2060-4570-a9e8-693e99a932b9-log-httpd\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.252014 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dc3d756-2060-4570-a9e8-693e99a932b9-run-httpd\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.252044 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.252072 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-scripts\") pod \"ceilometer-0\" (UID: 
\"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.252781 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.252932 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-config-data\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.253134 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.354438 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dc3d756-2060-4570-a9e8-693e99a932b9-log-httpd\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.354511 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dc3d756-2060-4570-a9e8-693e99a932b9-run-httpd\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc 
kubenswrapper[4956]: I0314 09:43:57.354557 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.354588 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-scripts\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.354621 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.354664 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-config-data\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.354723 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.354820 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9k4r\" (UniqueName: 
\"kubernetes.io/projected/7dc3d756-2060-4570-a9e8-693e99a932b9-kube-api-access-s9k4r\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.354992 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dc3d756-2060-4570-a9e8-693e99a932b9-log-httpd\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.355615 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dc3d756-2060-4570-a9e8-693e99a932b9-run-httpd\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.359019 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.359142 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-scripts\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.359891 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc 
kubenswrapper[4956]: I0314 09:43:57.363816 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.364643 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-config-data\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.382098 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9k4r\" (UniqueName: \"kubernetes.io/projected/7dc3d756-2060-4570-a9e8-693e99a932b9-kube-api-access-s9k4r\") pod \"ceilometer-0\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.484665 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.615552 4956 scope.go:117] "RemoveContainer" containerID="85c78f73d76851ad60e66f3dd7318eee356254f09a7c201a5201d306d3f4c103" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.635388 4956 scope.go:117] "RemoveContainer" containerID="04bb138cd4fd52e8e318184b642c4ab3a690080f8780675cd6f1ee1361022b4e" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.658244 4956 scope.go:117] "RemoveContainer" containerID="11b83dcd3b5441b2e4b7e4572e10a3322a09d12bdc5edb40a889ee5ba430e389" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.731770 4956 scope.go:117] "RemoveContainer" containerID="8a6c82a88d310589e2c13639d8725404c296e45576e0941be14e1263b40dffcc" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.752140 4956 scope.go:117] "RemoveContainer" containerID="043fece4492dd96929a18f91a3ed30ef7e9495e0ce541358bed56d7bf4f9ced0" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.772862 4956 scope.go:117] "RemoveContainer" containerID="8526826613a3132376d5ee31843e1927f8a239198f8f22204857ac298b732c9b" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.788300 4956 scope.go:117] "RemoveContainer" containerID="6f4983b6892f340840caef852dc183bea3837594e5c71f519e087cd83fe3da32" Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.920081 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:43:57 crc kubenswrapper[4956]: W0314 09:43:57.928300 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dc3d756_2060_4570_a9e8_693e99a932b9.slice/crio-46a43b0a98a342203345f2fd7655551626a5ef60a737700df5aef9160bf8adb2 WatchSource:0}: Error finding container 46a43b0a98a342203345f2fd7655551626a5ef60a737700df5aef9160bf8adb2: Status 404 returned error can't find the container with id 
46a43b0a98a342203345f2fd7655551626a5ef60a737700df5aef9160bf8adb2 Mar 14 09:43:57 crc kubenswrapper[4956]: I0314 09:43:57.940130 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:43:58 crc kubenswrapper[4956]: I0314 09:43:58.100055 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7dc3d756-2060-4570-a9e8-693e99a932b9","Type":"ContainerStarted","Data":"46a43b0a98a342203345f2fd7655551626a5ef60a737700df5aef9160bf8adb2"} Mar 14 09:43:59 crc kubenswrapper[4956]: I0314 09:43:59.118399 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7dc3d756-2060-4570-a9e8-693e99a932b9","Type":"ContainerStarted","Data":"a944e4c4c37ad3c91848e25c44e642606e43d23133df417a265807b1f61d4868"} Mar 14 09:44:00 crc kubenswrapper[4956]: I0314 09:44:00.127991 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7dc3d756-2060-4570-a9e8-693e99a932b9","Type":"ContainerStarted","Data":"bf8f69e724e5dbb841366fdc3f4c1eeac5be8745a8132ea072121f152f815b57"} Mar 14 09:44:00 crc kubenswrapper[4956]: I0314 09:44:00.128635 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7dc3d756-2060-4570-a9e8-693e99a932b9","Type":"ContainerStarted","Data":"89f1ecb1dc3f16f01f4b35f4b32eefcd9e4473de8d47a25593825b2f0556b50c"} Mar 14 09:44:00 crc kubenswrapper[4956]: I0314 09:44:00.139253 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558024-slxr8"] Mar 14 09:44:00 crc kubenswrapper[4956]: I0314 09:44:00.141108 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558024-slxr8"
Mar 14 09:44:00 crc kubenswrapper[4956]: I0314 09:44:00.153963 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 09:44:00 crc kubenswrapper[4956]: I0314 09:44:00.154112 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 09:44:00 crc kubenswrapper[4956]: I0314 09:44:00.154161 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx"
Mar 14 09:44:00 crc kubenswrapper[4956]: I0314 09:44:00.156572 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558024-slxr8"]
Mar 14 09:44:00 crc kubenswrapper[4956]: I0314 09:44:00.206627 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnjpl\" (UniqueName: \"kubernetes.io/projected/876cec90-17ca-4a4c-8cbc-462048c32b1c-kube-api-access-rnjpl\") pod \"auto-csr-approver-29558024-slxr8\" (UID: \"876cec90-17ca-4a4c-8cbc-462048c32b1c\") " pod="openshift-infra/auto-csr-approver-29558024-slxr8"
Mar 14 09:44:00 crc kubenswrapper[4956]: I0314 09:44:00.308237 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnjpl\" (UniqueName: \"kubernetes.io/projected/876cec90-17ca-4a4c-8cbc-462048c32b1c-kube-api-access-rnjpl\") pod \"auto-csr-approver-29558024-slxr8\" (UID: \"876cec90-17ca-4a4c-8cbc-462048c32b1c\") " pod="openshift-infra/auto-csr-approver-29558024-slxr8"
Mar 14 09:44:00 crc kubenswrapper[4956]: I0314 09:44:00.325076 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnjpl\" (UniqueName: \"kubernetes.io/projected/876cec90-17ca-4a4c-8cbc-462048c32b1c-kube-api-access-rnjpl\") pod \"auto-csr-approver-29558024-slxr8\" (UID: \"876cec90-17ca-4a4c-8cbc-462048c32b1c\") " pod="openshift-infra/auto-csr-approver-29558024-slxr8"
Mar 14 09:44:00 crc kubenswrapper[4956]: I0314 09:44:00.471594 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558024-slxr8"
Mar 14 09:44:00 crc kubenswrapper[4956]: I0314 09:44:00.955148 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558024-slxr8"]
Mar 14 09:44:00 crc kubenswrapper[4956]: W0314 09:44:00.978924 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod876cec90_17ca_4a4c_8cbc_462048c32b1c.slice/crio-4c2710921640f374dd6bfdb61318d13bb0f90f7f4c706c215b69db4e629f1c41 WatchSource:0}: Error finding container 4c2710921640f374dd6bfdb61318d13bb0f90f7f4c706c215b69db4e629f1c41: Status 404 returned error can't find the container with id 4c2710921640f374dd6bfdb61318d13bb0f90f7f4c706c215b69db4e629f1c41
Mar 14 09:44:01 crc kubenswrapper[4956]: I0314 09:44:01.136129 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558024-slxr8" event={"ID":"876cec90-17ca-4a4c-8cbc-462048c32b1c","Type":"ContainerStarted","Data":"4c2710921640f374dd6bfdb61318d13bb0f90f7f4c706c215b69db4e629f1c41"}
Mar 14 09:44:02 crc kubenswrapper[4956]: I0314 09:44:02.145293 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558024-slxr8" event={"ID":"876cec90-17ca-4a4c-8cbc-462048c32b1c","Type":"ContainerStarted","Data":"41b3a68e9782fa6a945a3081a4b8ec44473c1c14b30979c5d702fddcd8ba6913"}
Mar 14 09:44:02 crc kubenswrapper[4956]: I0314 09:44:02.149121 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7dc3d756-2060-4570-a9e8-693e99a932b9","Type":"ContainerStarted","Data":"7d97e71b68ada691a7cb95f4c3a11acacb9f90384df137f46bdba39f6c07e8e2"}
Mar 14 09:44:02 crc kubenswrapper[4956]: I0314 09:44:02.149294 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0"
Mar 14 09:44:02 crc kubenswrapper[4956]: I0314 09:44:02.171741 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558024-slxr8" podStartSLOduration=1.3943686610000001 podStartE2EDuration="2.171718766s" podCreationTimestamp="2026-03-14 09:44:00 +0000 UTC" firstStartedPulling="2026-03-14 09:44:00.981499031 +0000 UTC m=+2846.494191299" lastFinishedPulling="2026-03-14 09:44:01.758849146 +0000 UTC m=+2847.271541404" observedRunningTime="2026-03-14 09:44:02.162510373 +0000 UTC m=+2847.675202651" watchObservedRunningTime="2026-03-14 09:44:02.171718766 +0000 UTC m=+2847.684411034"
Mar 14 09:44:02 crc kubenswrapper[4956]: I0314 09:44:02.191721 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.293833553 podStartE2EDuration="5.191701052s" podCreationTimestamp="2026-03-14 09:43:57 +0000 UTC" firstStartedPulling="2026-03-14 09:43:57.939917855 +0000 UTC m=+2843.452610123" lastFinishedPulling="2026-03-14 09:44:01.837785354 +0000 UTC m=+2847.350477622" observedRunningTime="2026-03-14 09:44:02.189821964 +0000 UTC m=+2847.702514242" watchObservedRunningTime="2026-03-14 09:44:02.191701052 +0000 UTC m=+2847.704393320"
Mar 14 09:44:03 crc kubenswrapper[4956]: I0314 09:44:03.165539 4956 generic.go:334] "Generic (PLEG): container finished" podID="876cec90-17ca-4a4c-8cbc-462048c32b1c" containerID="41b3a68e9782fa6a945a3081a4b8ec44473c1c14b30979c5d702fddcd8ba6913" exitCode=0
Mar 14 09:44:03 crc kubenswrapper[4956]: I0314 09:44:03.166612 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558024-slxr8" event={"ID":"876cec90-17ca-4a4c-8cbc-462048c32b1c","Type":"ContainerDied","Data":"41b3a68e9782fa6a945a3081a4b8ec44473c1c14b30979c5d702fddcd8ba6913"}
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.468439 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558024-slxr8"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.574539 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnjpl\" (UniqueName: \"kubernetes.io/projected/876cec90-17ca-4a4c-8cbc-462048c32b1c-kube-api-access-rnjpl\") pod \"876cec90-17ca-4a4c-8cbc-462048c32b1c\" (UID: \"876cec90-17ca-4a4c-8cbc-462048c32b1c\") "
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.582974 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/876cec90-17ca-4a4c-8cbc-462048c32b1c-kube-api-access-rnjpl" (OuterVolumeSpecName: "kube-api-access-rnjpl") pod "876cec90-17ca-4a4c-8cbc-462048c32b1c" (UID: "876cec90-17ca-4a4c-8cbc-462048c32b1c"). InnerVolumeSpecName "kube-api-access-rnjpl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.583102 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"]
Mar 14 09:44:04 crc kubenswrapper[4956]: E0314 09:44:04.583828 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="876cec90-17ca-4a4c-8cbc-462048c32b1c" containerName="oc"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.583855 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="876cec90-17ca-4a4c-8cbc-462048c32b1c" containerName="oc"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.584096 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="876cec90-17ca-4a4c-8cbc-462048c32b1c" containerName="oc"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.585304 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.595861 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"]
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.677036 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdvrv\" (UniqueName: \"kubernetes.io/projected/e1808106-4778-41e2-9aa0-e3012f91255d-kube-api-access-fdvrv\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.677077 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-combined-ca-bundle\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.677107 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-cert-memcached-mtls\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.677156 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1808106-4778-41e2-9aa0-e3012f91255d-logs\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.677274 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-custom-prometheus-ca\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.677590 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-config-data\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.677690 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnjpl\" (UniqueName: \"kubernetes.io/projected/876cec90-17ca-4a4c-8cbc-462048c32b1c-kube-api-access-rnjpl\") on node \"crc\" DevicePath \"\""
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.779650 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-config-data\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.779712 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdvrv\" (UniqueName: \"kubernetes.io/projected/e1808106-4778-41e2-9aa0-e3012f91255d-kube-api-access-fdvrv\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.779732 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-combined-ca-bundle\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.779757 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-cert-memcached-mtls\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.779807 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1808106-4778-41e2-9aa0-e3012f91255d-logs\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.779834 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-custom-prometheus-ca\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.780338 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1808106-4778-41e2-9aa0-e3012f91255d-logs\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.784226 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-custom-prometheus-ca\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.784382 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-combined-ca-bundle\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.785343 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-config-data\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.792331 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-cert-memcached-mtls\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.795596 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdvrv\" (UniqueName: \"kubernetes.io/projected/e1808106-4778-41e2-9aa0-e3012f91255d-kube-api-access-fdvrv\") pod \"watcher-kuttl-api-2\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") " pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:04 crc kubenswrapper[4956]: I0314 09:44:04.938995 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:05 crc kubenswrapper[4956]: I0314 09:44:05.187871 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558024-slxr8" event={"ID":"876cec90-17ca-4a4c-8cbc-462048c32b1c","Type":"ContainerDied","Data":"4c2710921640f374dd6bfdb61318d13bb0f90f7f4c706c215b69db4e629f1c41"}
Mar 14 09:44:05 crc kubenswrapper[4956]: I0314 09:44:05.187916 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c2710921640f374dd6bfdb61318d13bb0f90f7f4c706c215b69db4e629f1c41"
Mar 14 09:44:05 crc kubenswrapper[4956]: I0314 09:44:05.187966 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558024-slxr8"
Mar 14 09:44:05 crc kubenswrapper[4956]: I0314 09:44:05.235981 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558018-qrkhg"]
Mar 14 09:44:05 crc kubenswrapper[4956]: I0314 09:44:05.245340 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558018-qrkhg"]
Mar 14 09:44:05 crc kubenswrapper[4956]: I0314 09:44:05.369783 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"]
Mar 14 09:44:05 crc kubenswrapper[4956]: W0314 09:44:05.371639 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1808106_4778_41e2_9aa0_e3012f91255d.slice/crio-dedf1ec44cd79bc91e73ec12811bfcd368f9a9d0b8d0757ef8247e13b64732d2 WatchSource:0}: Error finding container dedf1ec44cd79bc91e73ec12811bfcd368f9a9d0b8d0757ef8247e13b64732d2: Status 404 returned error can't find the container with id dedf1ec44cd79bc91e73ec12811bfcd368f9a9d0b8d0757ef8247e13b64732d2
Mar 14 09:44:06 crc kubenswrapper[4956]: I0314 09:44:06.198707 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"e1808106-4778-41e2-9aa0-e3012f91255d","Type":"ContainerStarted","Data":"604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc"}
Mar 14 09:44:06 crc kubenswrapper[4956]: I0314 09:44:06.198987 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"e1808106-4778-41e2-9aa0-e3012f91255d","Type":"ContainerStarted","Data":"2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4"}
Mar 14 09:44:06 crc kubenswrapper[4956]: I0314 09:44:06.199004 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"e1808106-4778-41e2-9aa0-e3012f91255d","Type":"ContainerStarted","Data":"dedf1ec44cd79bc91e73ec12811bfcd368f9a9d0b8d0757ef8247e13b64732d2"}
Mar 14 09:44:06 crc kubenswrapper[4956]: I0314 09:44:06.199356 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:06 crc kubenswrapper[4956]: I0314 09:44:06.227525 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-2" podStartSLOduration=2.227504342 podStartE2EDuration="2.227504342s" podCreationTimestamp="2026-03-14 09:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:44:06.218904634 +0000 UTC m=+2851.731596912" watchObservedRunningTime="2026-03-14 09:44:06.227504342 +0000 UTC m=+2851.740196610"
Mar 14 09:44:07 crc kubenswrapper[4956]: I0314 09:44:07.221029 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6db0f53f-2121-43af-bffe-84fdafb9817a" path="/var/lib/kubelet/pods/6db0f53f-2121-43af-bffe-84fdafb9817a/volumes"
Mar 14 09:44:08 crc kubenswrapper[4956]: I0314 09:44:08.504087 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:09 crc kubenswrapper[4956]: I0314 09:44:09.939940 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:14 crc kubenswrapper[4956]: I0314 09:44:14.939889 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:14 crc kubenswrapper[4956]: I0314 09:44:14.943903 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:15 crc kubenswrapper[4956]: I0314 09:44:15.282311 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:16 crc kubenswrapper[4956]: I0314 09:44:16.098940 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"]
Mar 14 09:44:16 crc kubenswrapper[4956]: I0314 09:44:16.108441 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"]
Mar 14 09:44:16 crc kubenswrapper[4956]: I0314 09:44:16.108733 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="b5f047b6-f285-411b-9dcc-1883a2bbb1b0" containerName="watcher-kuttl-api-log" containerID="cri-o://1d88463826e50ac0b1808e7ae6e5ee0c36fc26eed22a660e8a0923089f81ecfa" gracePeriod=30
Mar 14 09:44:16 crc kubenswrapper[4956]: I0314 09:44:16.109198 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="b5f047b6-f285-411b-9dcc-1883a2bbb1b0" containerName="watcher-api" containerID="cri-o://d580e025fe4dcfb7cf2985d7c067431e022c13c00a5a572b4d9d5422ef81ddd9" gracePeriod=30
Mar 14 09:44:16 crc kubenswrapper[4956]: I0314 09:44:16.289175 4956 generic.go:334] "Generic (PLEG): container finished" podID="b5f047b6-f285-411b-9dcc-1883a2bbb1b0" containerID="1d88463826e50ac0b1808e7ae6e5ee0c36fc26eed22a660e8a0923089f81ecfa" exitCode=143
Mar 14 09:44:16 crc kubenswrapper[4956]: I0314 09:44:16.289228 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"b5f047b6-f285-411b-9dcc-1883a2bbb1b0","Type":"ContainerDied","Data":"1d88463826e50ac0b1808e7ae6e5ee0c36fc26eed22a660e8a0923089f81ecfa"}
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.089733 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.171395 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-combined-ca-bundle\") pod \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") "
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.171547 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-custom-prometheus-ca\") pod \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") "
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.171637 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-cert-memcached-mtls\") pod \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") "
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.171665 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4497\" (UniqueName: \"kubernetes.io/projected/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-kube-api-access-w4497\") pod \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") "
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.171693 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-logs\") pod \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") "
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.171716 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-config-data\") pod \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\" (UID: \"b5f047b6-f285-411b-9dcc-1883a2bbb1b0\") "
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.172770 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-logs" (OuterVolumeSpecName: "logs") pod "b5f047b6-f285-411b-9dcc-1883a2bbb1b0" (UID: "b5f047b6-f285-411b-9dcc-1883a2bbb1b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.177475 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-kube-api-access-w4497" (OuterVolumeSpecName: "kube-api-access-w4497") pod "b5f047b6-f285-411b-9dcc-1883a2bbb1b0" (UID: "b5f047b6-f285-411b-9dcc-1883a2bbb1b0"). InnerVolumeSpecName "kube-api-access-w4497". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.197306 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5f047b6-f285-411b-9dcc-1883a2bbb1b0" (UID: "b5f047b6-f285-411b-9dcc-1883a2bbb1b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.198515 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "b5f047b6-f285-411b-9dcc-1883a2bbb1b0" (UID: "b5f047b6-f285-411b-9dcc-1883a2bbb1b0"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.216731 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-config-data" (OuterVolumeSpecName: "config-data") pod "b5f047b6-f285-411b-9dcc-1883a2bbb1b0" (UID: "b5f047b6-f285-411b-9dcc-1883a2bbb1b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.251531 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "b5f047b6-f285-411b-9dcc-1883a2bbb1b0" (UID: "b5f047b6-f285-411b-9dcc-1883a2bbb1b0"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.274444 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.274502 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.274519 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.274531 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-cert-memcached-mtls\") on node \"crc\" DevicePath \"\""
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.274541 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4497\" (UniqueName: \"kubernetes.io/projected/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-kube-api-access-w4497\") on node \"crc\" DevicePath \"\""
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.274553 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f047b6-f285-411b-9dcc-1883a2bbb1b0-logs\") on node \"crc\" DevicePath \"\""
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.299784 4956 generic.go:334] "Generic (PLEG): container finished" podID="b5f047b6-f285-411b-9dcc-1883a2bbb1b0" containerID="d580e025fe4dcfb7cf2985d7c067431e022c13c00a5a572b4d9d5422ef81ddd9" exitCode=0
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.299876 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1"
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.299907 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"b5f047b6-f285-411b-9dcc-1883a2bbb1b0","Type":"ContainerDied","Data":"d580e025fe4dcfb7cf2985d7c067431e022c13c00a5a572b4d9d5422ef81ddd9"}
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.299980 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"b5f047b6-f285-411b-9dcc-1883a2bbb1b0","Type":"ContainerDied","Data":"c1e704b69da4ed696efe2f2c12e41263c5f239d6debb0238a172f77b1cffc6e0"}
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.300005 4956 scope.go:117] "RemoveContainer" containerID="d580e025fe4dcfb7cf2985d7c067431e022c13c00a5a572b4d9d5422ef81ddd9"
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.300475 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-2" podUID="e1808106-4778-41e2-9aa0-e3012f91255d" containerName="watcher-kuttl-api-log" containerID="cri-o://2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4" gracePeriod=30
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.300604 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-2" podUID="e1808106-4778-41e2-9aa0-e3012f91255d" containerName="watcher-api" containerID="cri-o://604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc" gracePeriod=30
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.332670 4956 scope.go:117] "RemoveContainer" containerID="1d88463826e50ac0b1808e7ae6e5ee0c36fc26eed22a660e8a0923089f81ecfa"
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.334555 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"]
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.353598 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"]
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.357653 4956 scope.go:117] "RemoveContainer" containerID="d580e025fe4dcfb7cf2985d7c067431e022c13c00a5a572b4d9d5422ef81ddd9"
Mar 14 09:44:17 crc kubenswrapper[4956]: E0314 09:44:17.358120 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d580e025fe4dcfb7cf2985d7c067431e022c13c00a5a572b4d9d5422ef81ddd9\": container with ID starting with d580e025fe4dcfb7cf2985d7c067431e022c13c00a5a572b4d9d5422ef81ddd9 not found: ID does not exist" containerID="d580e025fe4dcfb7cf2985d7c067431e022c13c00a5a572b4d9d5422ef81ddd9"
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.358174 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d580e025fe4dcfb7cf2985d7c067431e022c13c00a5a572b4d9d5422ef81ddd9"} err="failed to get container status \"d580e025fe4dcfb7cf2985d7c067431e022c13c00a5a572b4d9d5422ef81ddd9\": rpc error: code = NotFound desc = could not find container \"d580e025fe4dcfb7cf2985d7c067431e022c13c00a5a572b4d9d5422ef81ddd9\": container with ID starting with d580e025fe4dcfb7cf2985d7c067431e022c13c00a5a572b4d9d5422ef81ddd9 not found: ID does not exist"
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.358197 4956 scope.go:117] "RemoveContainer" containerID="1d88463826e50ac0b1808e7ae6e5ee0c36fc26eed22a660e8a0923089f81ecfa"
Mar 14 09:44:17 crc kubenswrapper[4956]: E0314 09:44:17.360174 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d88463826e50ac0b1808e7ae6e5ee0c36fc26eed22a660e8a0923089f81ecfa\": container with ID starting with 1d88463826e50ac0b1808e7ae6e5ee0c36fc26eed22a660e8a0923089f81ecfa not found: ID does not exist" containerID="1d88463826e50ac0b1808e7ae6e5ee0c36fc26eed22a660e8a0923089f81ecfa"
Mar 14 09:44:17 crc kubenswrapper[4956]: I0314 09:44:17.360216 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d88463826e50ac0b1808e7ae6e5ee0c36fc26eed22a660e8a0923089f81ecfa"} err="failed to get container status \"1d88463826e50ac0b1808e7ae6e5ee0c36fc26eed22a660e8a0923089f81ecfa\": rpc error: code = NotFound desc = could not find container \"1d88463826e50ac0b1808e7ae6e5ee0c36fc26eed22a660e8a0923089f81ecfa\": container with ID starting with 1d88463826e50ac0b1808e7ae6e5ee0c36fc26eed22a660e8a0923089f81ecfa not found: ID does not exist"
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.089524 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-2"
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.189656 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdvrv\" (UniqueName: \"kubernetes.io/projected/e1808106-4778-41e2-9aa0-e3012f91255d-kube-api-access-fdvrv\") pod \"e1808106-4778-41e2-9aa0-e3012f91255d\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") "
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.189726 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-config-data\") pod \"e1808106-4778-41e2-9aa0-e3012f91255d\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") "
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.189819 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-combined-ca-bundle\") pod \"e1808106-4778-41e2-9aa0-e3012f91255d\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") "
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.189890 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-cert-memcached-mtls\") pod \"e1808106-4778-41e2-9aa0-e3012f91255d\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") "
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.189930 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-custom-prometheus-ca\") pod \"e1808106-4778-41e2-9aa0-e3012f91255d\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") "
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.189958 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1808106-4778-41e2-9aa0-e3012f91255d-logs\") pod \"e1808106-4778-41e2-9aa0-e3012f91255d\" (UID: \"e1808106-4778-41e2-9aa0-e3012f91255d\") "
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.190631 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1808106-4778-41e2-9aa0-e3012f91255d-logs" (OuterVolumeSpecName: "logs") pod "e1808106-4778-41e2-9aa0-e3012f91255d" (UID: "e1808106-4778-41e2-9aa0-e3012f91255d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.194264 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1808106-4778-41e2-9aa0-e3012f91255d-kube-api-access-fdvrv" (OuterVolumeSpecName: "kube-api-access-fdvrv") pod "e1808106-4778-41e2-9aa0-e3012f91255d" (UID: "e1808106-4778-41e2-9aa0-e3012f91255d"). InnerVolumeSpecName "kube-api-access-fdvrv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.211687 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1808106-4778-41e2-9aa0-e3012f91255d" (UID: "e1808106-4778-41e2-9aa0-e3012f91255d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.211765 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "e1808106-4778-41e2-9aa0-e3012f91255d" (UID: "e1808106-4778-41e2-9aa0-e3012f91255d"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.228793 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-config-data" (OuterVolumeSpecName: "config-data") pod "e1808106-4778-41e2-9aa0-e3012f91255d" (UID: "e1808106-4778-41e2-9aa0-e3012f91255d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.246904 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "e1808106-4778-41e2-9aa0-e3012f91255d" (UID: "e1808106-4778-41e2-9aa0-e3012f91255d"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.292124 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.292154 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-cert-memcached-mtls\") on node \"crc\" DevicePath \"\""
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.292164 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.292174 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1808106-4778-41e2-9aa0-e3012f91255d-logs\") on node \"crc\" DevicePath \"\""
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.292184 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdvrv\" (UniqueName: \"kubernetes.io/projected/e1808106-4778-41e2-9aa0-e3012f91255d-kube-api-access-fdvrv\") on node \"crc\" DevicePath \"\""
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.292195 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1808106-4778-41e2-9aa0-e3012f91255d-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.312155 4956 generic.go:334] "Generic (PLEG): container finished" podID="e1808106-4778-41e2-9aa0-e3012f91255d" containerID="604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc" exitCode=0
Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.312191 4956
generic.go:334] "Generic (PLEG): container finished" podID="e1808106-4778-41e2-9aa0-e3012f91255d" containerID="2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4" exitCode=143 Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.312213 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"e1808106-4778-41e2-9aa0-e3012f91255d","Type":"ContainerDied","Data":"604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc"} Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.312232 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-2" Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.312245 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"e1808106-4778-41e2-9aa0-e3012f91255d","Type":"ContainerDied","Data":"2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4"} Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.312256 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"e1808106-4778-41e2-9aa0-e3012f91255d","Type":"ContainerDied","Data":"dedf1ec44cd79bc91e73ec12811bfcd368f9a9d0b8d0757ef8247e13b64732d2"} Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.312273 4956 scope.go:117] "RemoveContainer" containerID="604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc" Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.329657 4956 scope.go:117] "RemoveContainer" containerID="2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4" Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.342343 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"] Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.351224 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["watcher-kuttl-default/watcher-kuttl-api-2"] Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.355390 4956 scope.go:117] "RemoveContainer" containerID="604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc" Mar 14 09:44:18 crc kubenswrapper[4956]: E0314 09:44:18.355964 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc\": container with ID starting with 604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc not found: ID does not exist" containerID="604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc" Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.355999 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc"} err="failed to get container status \"604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc\": rpc error: code = NotFound desc = could not find container \"604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc\": container with ID starting with 604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc not found: ID does not exist" Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.356023 4956 scope.go:117] "RemoveContainer" containerID="2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4" Mar 14 09:44:18 crc kubenswrapper[4956]: E0314 09:44:18.356515 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4\": container with ID starting with 2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4 not found: ID does not exist" containerID="2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4" Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.356607 4956 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4"} err="failed to get container status \"2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4\": rpc error: code = NotFound desc = could not find container \"2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4\": container with ID starting with 2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4 not found: ID does not exist" Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.356688 4956 scope.go:117] "RemoveContainer" containerID="604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc" Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.357047 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc"} err="failed to get container status \"604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc\": rpc error: code = NotFound desc = could not find container \"604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc\": container with ID starting with 604c218f4792b484e276de097dfff53d65bec6f985e6a4e8b114bfe6ee1abbcc not found: ID does not exist" Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.357122 4956 scope.go:117] "RemoveContainer" containerID="2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4" Mar 14 09:44:18 crc kubenswrapper[4956]: I0314 09:44:18.357465 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4"} err="failed to get container status \"2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4\": rpc error: code = NotFound desc = could not find container \"2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4\": container with ID starting with 
2aba0c04bd28d5f9cf2252176c3bd5032a670628577aa9f774c74cd291664cb4 not found: ID does not exist" Mar 14 09:44:19 crc kubenswrapper[4956]: I0314 09:44:19.218924 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f047b6-f285-411b-9dcc-1883a2bbb1b0" path="/var/lib/kubelet/pods/b5f047b6-f285-411b-9dcc-1883a2bbb1b0/volumes" Mar 14 09:44:19 crc kubenswrapper[4956]: I0314 09:44:19.219669 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1808106-4778-41e2-9aa0-e3012f91255d" path="/var/lib/kubelet/pods/e1808106-4778-41e2-9aa0-e3012f91255d/volumes" Mar 14 09:44:19 crc kubenswrapper[4956]: I0314 09:44:19.354530 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:44:19 crc kubenswrapper[4956]: I0314 09:44:19.355552 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="ff8672e3-e8f7-4738-9925-9945e4db581e" containerName="watcher-kuttl-api-log" containerID="cri-o://2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18" gracePeriod=30 Mar 14 09:44:19 crc kubenswrapper[4956]: I0314 09:44:19.362538 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="ff8672e3-e8f7-4738-9925-9945e4db581e" containerName="watcher-api" containerID="cri-o://4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044" gracePeriod=30 Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.199921 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.324516 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl4j4\" (UniqueName: \"kubernetes.io/projected/ff8672e3-e8f7-4738-9925-9945e4db581e-kube-api-access-bl4j4\") pod \"ff8672e3-e8f7-4738-9925-9945e4db581e\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.324940 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-combined-ca-bundle\") pod \"ff8672e3-e8f7-4738-9925-9945e4db581e\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.325015 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8672e3-e8f7-4738-9925-9945e4db581e-logs\") pod \"ff8672e3-e8f7-4738-9925-9945e4db581e\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.325115 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-custom-prometheus-ca\") pod \"ff8672e3-e8f7-4738-9925-9945e4db581e\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.325239 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-config-data\") pod \"ff8672e3-e8f7-4738-9925-9945e4db581e\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.325297 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-cert-memcached-mtls\") pod \"ff8672e3-e8f7-4738-9925-9945e4db581e\" (UID: \"ff8672e3-e8f7-4738-9925-9945e4db581e\") " Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.326742 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff8672e3-e8f7-4738-9925-9945e4db581e-logs" (OuterVolumeSpecName: "logs") pod "ff8672e3-e8f7-4738-9925-9945e4db581e" (UID: "ff8672e3-e8f7-4738-9925-9945e4db581e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.330432 4956 generic.go:334] "Generic (PLEG): container finished" podID="ff8672e3-e8f7-4738-9925-9945e4db581e" containerID="4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044" exitCode=0 Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.330560 4956 generic.go:334] "Generic (PLEG): container finished" podID="ff8672e3-e8f7-4738-9925-9945e4db581e" containerID="2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18" exitCode=143 Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.330532 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"ff8672e3-e8f7-4738-9925-9945e4db581e","Type":"ContainerDied","Data":"4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044"} Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.330596 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"ff8672e3-e8f7-4738-9925-9945e4db581e","Type":"ContainerDied","Data":"2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18"} Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.330609 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"ff8672e3-e8f7-4738-9925-9945e4db581e","Type":"ContainerDied","Data":"21b654a5e616109b65d50b0cad1cf5ab15d41040e9fc629a7c02d510ca0040d1"} Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.330626 4956 scope.go:117] "RemoveContainer" containerID="4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.330545 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.331304 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff8672e3-e8f7-4738-9925-9945e4db581e-kube-api-access-bl4j4" (OuterVolumeSpecName: "kube-api-access-bl4j4") pod "ff8672e3-e8f7-4738-9925-9945e4db581e" (UID: "ff8672e3-e8f7-4738-9925-9945e4db581e"). InnerVolumeSpecName "kube-api-access-bl4j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.359415 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "ff8672e3-e8f7-4738-9925-9945e4db581e" (UID: "ff8672e3-e8f7-4738-9925-9945e4db581e"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.363880 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff8672e3-e8f7-4738-9925-9945e4db581e" (UID: "ff8672e3-e8f7-4738-9925-9945e4db581e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.383623 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-config-data" (OuterVolumeSpecName: "config-data") pod "ff8672e3-e8f7-4738-9925-9945e4db581e" (UID: "ff8672e3-e8f7-4738-9925-9945e4db581e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.394941 4956 scope.go:117] "RemoveContainer" containerID="2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.414610 4956 scope.go:117] "RemoveContainer" containerID="4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044" Mar 14 09:44:20 crc kubenswrapper[4956]: E0314 09:44:20.415066 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044\": container with ID starting with 4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044 not found: ID does not exist" containerID="4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.415120 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044"} err="failed to get container status \"4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044\": rpc error: code = NotFound desc = could not find container \"4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044\": container with ID starting with 4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044 not found: ID does not exist" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.415145 4956 scope.go:117] "RemoveContainer" 
containerID="2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18" Mar 14 09:44:20 crc kubenswrapper[4956]: E0314 09:44:20.415478 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18\": container with ID starting with 2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18 not found: ID does not exist" containerID="2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.415567 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18"} err="failed to get container status \"2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18\": rpc error: code = NotFound desc = could not find container \"2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18\": container with ID starting with 2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18 not found: ID does not exist" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.415582 4956 scope.go:117] "RemoveContainer" containerID="4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.415970 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044"} err="failed to get container status \"4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044\": rpc error: code = NotFound desc = could not find container \"4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044\": container with ID starting with 4c3ccb35e7da1308ab518005891bca200ceea4c72ec8d457b4f42cc1fe3be044 not found: ID does not exist" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.416000 4956 scope.go:117] 
"RemoveContainer" containerID="2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.416261 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18"} err="failed to get container status \"2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18\": rpc error: code = NotFound desc = could not find container \"2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18\": container with ID starting with 2ab5ae2b023864124d5a37a6e513af752971570225f249d114a612e7c5e2da18 not found: ID does not exist" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.425047 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "ff8672e3-e8f7-4738-9925-9945e4db581e" (UID: "ff8672e3-e8f7-4738-9925-9945e4db581e"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.428166 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl4j4\" (UniqueName: \"kubernetes.io/projected/ff8672e3-e8f7-4738-9925-9945e4db581e-kube-api-access-bl4j4\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.428209 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.428221 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8672e3-e8f7-4738-9925-9945e4db581e-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.428271 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.428281 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.428291 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff8672e3-e8f7-4738-9925-9945e4db581e-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.598572 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-xrs27"] Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.607083 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["watcher-kuttl-default/watcher-kuttl-db-sync-xrs27"] Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.731744 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.738347 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.738610 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="2decb267-0305-4e3e-b505-cc929640d7a8" containerName="watcher-applier" containerID="cri-o://ed722effbccaaad2fa2c1831f9438c159b8fb25f2e5da4988970b64f58197ea3" gracePeriod=30 Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.756927 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.785342 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher6a49-account-delete-vlk68"] Mar 14 09:44:20 crc kubenswrapper[4956]: E0314 09:44:20.785721 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f047b6-f285-411b-9dcc-1883a2bbb1b0" containerName="watcher-kuttl-api-log" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.785737 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f047b6-f285-411b-9dcc-1883a2bbb1b0" containerName="watcher-kuttl-api-log" Mar 14 09:44:20 crc kubenswrapper[4956]: E0314 09:44:20.785750 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1808106-4778-41e2-9aa0-e3012f91255d" containerName="watcher-api" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.785757 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1808106-4778-41e2-9aa0-e3012f91255d" containerName="watcher-api" Mar 14 09:44:20 crc kubenswrapper[4956]: E0314 09:44:20.785777 4956 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ff8672e3-e8f7-4738-9925-9945e4db581e" containerName="watcher-api" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.785783 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8672e3-e8f7-4738-9925-9945e4db581e" containerName="watcher-api" Mar 14 09:44:20 crc kubenswrapper[4956]: E0314 09:44:20.785792 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1808106-4778-41e2-9aa0-e3012f91255d" containerName="watcher-kuttl-api-log" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.785798 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1808106-4778-41e2-9aa0-e3012f91255d" containerName="watcher-kuttl-api-log" Mar 14 09:44:20 crc kubenswrapper[4956]: E0314 09:44:20.785811 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f047b6-f285-411b-9dcc-1883a2bbb1b0" containerName="watcher-api" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.785816 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f047b6-f285-411b-9dcc-1883a2bbb1b0" containerName="watcher-api" Mar 14 09:44:20 crc kubenswrapper[4956]: E0314 09:44:20.785830 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8672e3-e8f7-4738-9925-9945e4db581e" containerName="watcher-kuttl-api-log" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.785836 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8672e3-e8f7-4738-9925-9945e4db581e" containerName="watcher-kuttl-api-log" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.786005 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1808106-4778-41e2-9aa0-e3012f91255d" containerName="watcher-kuttl-api-log" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.786026 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f047b6-f285-411b-9dcc-1883a2bbb1b0" containerName="watcher-kuttl-api-log" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.786038 4956 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e1808106-4778-41e2-9aa0-e3012f91255d" containerName="watcher-api" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.786048 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8672e3-e8f7-4738-9925-9945e4db581e" containerName="watcher-kuttl-api-log" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.786057 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8672e3-e8f7-4738-9925-9945e4db581e" containerName="watcher-api" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.786065 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f047b6-f285-411b-9dcc-1883a2bbb1b0" containerName="watcher-api" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.786702 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher6a49-account-delete-vlk68" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.835915 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/820c24eb-8503-4e7f-910e-98d300034b0e-operator-scripts\") pod \"watcher6a49-account-delete-vlk68\" (UID: \"820c24eb-8503-4e7f-910e-98d300034b0e\") " pod="watcher-kuttl-default/watcher6a49-account-delete-vlk68" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.836043 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7p55\" (UniqueName: \"kubernetes.io/projected/820c24eb-8503-4e7f-910e-98d300034b0e-kube-api-access-k7p55\") pod \"watcher6a49-account-delete-vlk68\" (UID: \"820c24eb-8503-4e7f-910e-98d300034b0e\") " pod="watcher-kuttl-default/watcher6a49-account-delete-vlk68" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.899907 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher6a49-account-delete-vlk68"] Mar 14 
09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.905247 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.905468 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="92102f18-f0db-441b-a147-a8fab678a887" containerName="watcher-decision-engine" containerID="cri-o://20afb1dc8be257cc26f7540c416a1f2b88b65c1008898fd90b47626350dde528" gracePeriod=30 Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.937755 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7p55\" (UniqueName: \"kubernetes.io/projected/820c24eb-8503-4e7f-910e-98d300034b0e-kube-api-access-k7p55\") pod \"watcher6a49-account-delete-vlk68\" (UID: \"820c24eb-8503-4e7f-910e-98d300034b0e\") " pod="watcher-kuttl-default/watcher6a49-account-delete-vlk68" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.937849 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/820c24eb-8503-4e7f-910e-98d300034b0e-operator-scripts\") pod \"watcher6a49-account-delete-vlk68\" (UID: \"820c24eb-8503-4e7f-910e-98d300034b0e\") " pod="watcher-kuttl-default/watcher6a49-account-delete-vlk68" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.938563 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/820c24eb-8503-4e7f-910e-98d300034b0e-operator-scripts\") pod \"watcher6a49-account-delete-vlk68\" (UID: \"820c24eb-8503-4e7f-910e-98d300034b0e\") " pod="watcher-kuttl-default/watcher6a49-account-delete-vlk68" Mar 14 09:44:20 crc kubenswrapper[4956]: I0314 09:44:20.963141 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7p55\" (UniqueName: 
\"kubernetes.io/projected/820c24eb-8503-4e7f-910e-98d300034b0e-kube-api-access-k7p55\") pod \"watcher6a49-account-delete-vlk68\" (UID: \"820c24eb-8503-4e7f-910e-98d300034b0e\") " pod="watcher-kuttl-default/watcher6a49-account-delete-vlk68" Mar 14 09:44:21 crc kubenswrapper[4956]: I0314 09:44:21.101174 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher6a49-account-delete-vlk68" Mar 14 09:44:21 crc kubenswrapper[4956]: I0314 09:44:21.222234 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b2e2cb-91b3-498b-978c-18fb76d846b4" path="/var/lib/kubelet/pods/e6b2e2cb-91b3-498b-978c-18fb76d846b4/volumes" Mar 14 09:44:21 crc kubenswrapper[4956]: I0314 09:44:21.223075 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff8672e3-e8f7-4738-9925-9945e4db581e" path="/var/lib/kubelet/pods/ff8672e3-e8f7-4738-9925-9945e4db581e/volumes" Mar 14 09:44:21 crc kubenswrapper[4956]: I0314 09:44:21.641373 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher6a49-account-delete-vlk68"] Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.364413 4956 generic.go:334] "Generic (PLEG): container finished" podID="2decb267-0305-4e3e-b505-cc929640d7a8" containerID="ed722effbccaaad2fa2c1831f9438c159b8fb25f2e5da4988970b64f58197ea3" exitCode=0 Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.364496 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"2decb267-0305-4e3e-b505-cc929640d7a8","Type":"ContainerDied","Data":"ed722effbccaaad2fa2c1831f9438c159b8fb25f2e5da4988970b64f58197ea3"} Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.366400 4956 generic.go:334] "Generic (PLEG): container finished" podID="820c24eb-8503-4e7f-910e-98d300034b0e" containerID="88cb6cb2d57cffc62756b9e28f4edd982d6ceac63b0cbd88465100267e440288" exitCode=0 Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 
09:44:22.366450 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher6a49-account-delete-vlk68" event={"ID":"820c24eb-8503-4e7f-910e-98d300034b0e","Type":"ContainerDied","Data":"88cb6cb2d57cffc62756b9e28f4edd982d6ceac63b0cbd88465100267e440288"} Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.367300 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher6a49-account-delete-vlk68" event={"ID":"820c24eb-8503-4e7f-910e-98d300034b0e","Type":"ContainerStarted","Data":"503143ac54c03d0002c871c597888f32067f3166c5bc3ec6d7773d72c68d2e99"} Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.479599 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.568538 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-cert-memcached-mtls\") pod \"2decb267-0305-4e3e-b505-cc929640d7a8\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.568585 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2decb267-0305-4e3e-b505-cc929640d7a8-logs\") pod \"2decb267-0305-4e3e-b505-cc929640d7a8\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.568664 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk9rd\" (UniqueName: \"kubernetes.io/projected/2decb267-0305-4e3e-b505-cc929640d7a8-kube-api-access-jk9rd\") pod \"2decb267-0305-4e3e-b505-cc929640d7a8\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.568754 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-config-data\") pod \"2decb267-0305-4e3e-b505-cc929640d7a8\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.568788 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-combined-ca-bundle\") pod \"2decb267-0305-4e3e-b505-cc929640d7a8\" (UID: \"2decb267-0305-4e3e-b505-cc929640d7a8\") " Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.570856 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2decb267-0305-4e3e-b505-cc929640d7a8-logs" (OuterVolumeSpecName: "logs") pod "2decb267-0305-4e3e-b505-cc929640d7a8" (UID: "2decb267-0305-4e3e-b505-cc929640d7a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.574562 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2decb267-0305-4e3e-b505-cc929640d7a8-kube-api-access-jk9rd" (OuterVolumeSpecName: "kube-api-access-jk9rd") pod "2decb267-0305-4e3e-b505-cc929640d7a8" (UID: "2decb267-0305-4e3e-b505-cc929640d7a8"). InnerVolumeSpecName "kube-api-access-jk9rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.603384 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2decb267-0305-4e3e-b505-cc929640d7a8" (UID: "2decb267-0305-4e3e-b505-cc929640d7a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.615022 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-config-data" (OuterVolumeSpecName: "config-data") pod "2decb267-0305-4e3e-b505-cc929640d7a8" (UID: "2decb267-0305-4e3e-b505-cc929640d7a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.642244 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "2decb267-0305-4e3e-b505-cc929640d7a8" (UID: "2decb267-0305-4e3e-b505-cc929640d7a8"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.670670 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.670703 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.670715 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2decb267-0305-4e3e-b505-cc929640d7a8-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.670725 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2decb267-0305-4e3e-b505-cc929640d7a8-logs\") on node \"crc\" DevicePath \"\"" Mar 14 
09:44:22 crc kubenswrapper[4956]: I0314 09:44:22.670734 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk9rd\" (UniqueName: \"kubernetes.io/projected/2decb267-0305-4e3e-b505-cc929640d7a8-kube-api-access-jk9rd\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.050749 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.189950 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w7d6\" (UniqueName: \"kubernetes.io/projected/92102f18-f0db-441b-a147-a8fab678a887-kube-api-access-7w7d6\") pod \"92102f18-f0db-441b-a147-a8fab678a887\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.190093 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-combined-ca-bundle\") pod \"92102f18-f0db-441b-a147-a8fab678a887\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.190127 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-cert-memcached-mtls\") pod \"92102f18-f0db-441b-a147-a8fab678a887\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.190155 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-config-data\") pod \"92102f18-f0db-441b-a147-a8fab678a887\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.190245 4956 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-custom-prometheus-ca\") pod \"92102f18-f0db-441b-a147-a8fab678a887\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.190321 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92102f18-f0db-441b-a147-a8fab678a887-logs\") pod \"92102f18-f0db-441b-a147-a8fab678a887\" (UID: \"92102f18-f0db-441b-a147-a8fab678a887\") " Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.190787 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92102f18-f0db-441b-a147-a8fab678a887-logs" (OuterVolumeSpecName: "logs") pod "92102f18-f0db-441b-a147-a8fab678a887" (UID: "92102f18-f0db-441b-a147-a8fab678a887"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.191223 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92102f18-f0db-441b-a147-a8fab678a887-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.198672 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92102f18-f0db-441b-a147-a8fab678a887-kube-api-access-7w7d6" (OuterVolumeSpecName: "kube-api-access-7w7d6") pod "92102f18-f0db-441b-a147-a8fab678a887" (UID: "92102f18-f0db-441b-a147-a8fab678a887"). InnerVolumeSpecName "kube-api-access-7w7d6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.214390 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92102f18-f0db-441b-a147-a8fab678a887" (UID: "92102f18-f0db-441b-a147-a8fab678a887"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.215385 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "92102f18-f0db-441b-a147-a8fab678a887" (UID: "92102f18-f0db-441b-a147-a8fab678a887"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.244425 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-config-data" (OuterVolumeSpecName: "config-data") pod "92102f18-f0db-441b-a147-a8fab678a887" (UID: "92102f18-f0db-441b-a147-a8fab678a887"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.274098 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "92102f18-f0db-441b-a147-a8fab678a887" (UID: "92102f18-f0db-441b-a147-a8fab678a887"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.293995 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.294042 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w7d6\" (UniqueName: \"kubernetes.io/projected/92102f18-f0db-441b-a147-a8fab678a887-kube-api-access-7w7d6\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.294057 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.294131 4956 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.294175 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92102f18-f0db-441b-a147-a8fab678a887-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.377932 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"2decb267-0305-4e3e-b505-cc929640d7a8","Type":"ContainerDied","Data":"8ad938bfe9b705cd10a49e38a8207ac74791c502ea8e227ab81ee69ab22162c7"} Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.378022 4956 scope.go:117] "RemoveContainer" containerID="ed722effbccaaad2fa2c1831f9438c159b8fb25f2e5da4988970b64f58197ea3" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.377975 4956 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.380140 4956 generic.go:334] "Generic (PLEG): container finished" podID="92102f18-f0db-441b-a147-a8fab678a887" containerID="20afb1dc8be257cc26f7540c416a1f2b88b65c1008898fd90b47626350dde528" exitCode=0 Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.380424 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.380427 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"92102f18-f0db-441b-a147-a8fab678a887","Type":"ContainerDied","Data":"20afb1dc8be257cc26f7540c416a1f2b88b65c1008898fd90b47626350dde528"} Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.380500 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"92102f18-f0db-441b-a147-a8fab678a887","Type":"ContainerDied","Data":"eb6ed7a3472ac5ca6854d9bc1bbe1f0fb6a8ef1bf7613d00c9449503650056b8"} Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.397941 4956 scope.go:117] "RemoveContainer" containerID="20afb1dc8be257cc26f7540c416a1f2b88b65c1008898fd90b47626350dde528" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.427445 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.438222 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.442759 4956 scope.go:117] "RemoveContainer" containerID="20afb1dc8be257cc26f7540c416a1f2b88b65c1008898fd90b47626350dde528" Mar 14 09:44:23 crc kubenswrapper[4956]: E0314 09:44:23.444016 4956 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20afb1dc8be257cc26f7540c416a1f2b88b65c1008898fd90b47626350dde528\": container with ID starting with 20afb1dc8be257cc26f7540c416a1f2b88b65c1008898fd90b47626350dde528 not found: ID does not exist" containerID="20afb1dc8be257cc26f7540c416a1f2b88b65c1008898fd90b47626350dde528" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.444058 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20afb1dc8be257cc26f7540c416a1f2b88b65c1008898fd90b47626350dde528"} err="failed to get container status \"20afb1dc8be257cc26f7540c416a1f2b88b65c1008898fd90b47626350dde528\": rpc error: code = NotFound desc = could not find container \"20afb1dc8be257cc26f7540c416a1f2b88b65c1008898fd90b47626350dde528\": container with ID starting with 20afb1dc8be257cc26f7540c416a1f2b88b65c1008898fd90b47626350dde528 not found: ID does not exist" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.446080 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.453176 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.724342 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher6a49-account-delete-vlk68" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.800854 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7p55\" (UniqueName: \"kubernetes.io/projected/820c24eb-8503-4e7f-910e-98d300034b0e-kube-api-access-k7p55\") pod \"820c24eb-8503-4e7f-910e-98d300034b0e\" (UID: \"820c24eb-8503-4e7f-910e-98d300034b0e\") " Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.801004 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/820c24eb-8503-4e7f-910e-98d300034b0e-operator-scripts\") pod \"820c24eb-8503-4e7f-910e-98d300034b0e\" (UID: \"820c24eb-8503-4e7f-910e-98d300034b0e\") " Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.801893 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/820c24eb-8503-4e7f-910e-98d300034b0e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "820c24eb-8503-4e7f-910e-98d300034b0e" (UID: "820c24eb-8503-4e7f-910e-98d300034b0e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.817274 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/820c24eb-8503-4e7f-910e-98d300034b0e-kube-api-access-k7p55" (OuterVolumeSpecName: "kube-api-access-k7p55") pod "820c24eb-8503-4e7f-910e-98d300034b0e" (UID: "820c24eb-8503-4e7f-910e-98d300034b0e"). InnerVolumeSpecName "kube-api-access-k7p55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.902636 4956 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/820c24eb-8503-4e7f-910e-98d300034b0e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:23 crc kubenswrapper[4956]: I0314 09:44:23.902980 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7p55\" (UniqueName: \"kubernetes.io/projected/820c24eb-8503-4e7f-910e-98d300034b0e-kube-api-access-k7p55\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:24 crc kubenswrapper[4956]: I0314 09:44:24.136897 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:44:24 crc kubenswrapper[4956]: I0314 09:44:24.137242 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerName="ceilometer-central-agent" containerID="cri-o://a944e4c4c37ad3c91848e25c44e642606e43d23133df417a265807b1f61d4868" gracePeriod=30 Mar 14 09:44:24 crc kubenswrapper[4956]: I0314 09:44:24.137301 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerName="sg-core" containerID="cri-o://bf8f69e724e5dbb841366fdc3f4c1eeac5be8745a8132ea072121f152f815b57" gracePeriod=30 Mar 14 09:44:24 crc kubenswrapper[4956]: I0314 09:44:24.137351 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerName="ceilometer-notification-agent" containerID="cri-o://89f1ecb1dc3f16f01f4b35f4b32eefcd9e4473de8d47a25593825b2f0556b50c" gracePeriod=30 Mar 14 09:44:24 crc kubenswrapper[4956]: I0314 09:44:24.137303 4956 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="watcher-kuttl-default/ceilometer-0" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerName="proxy-httpd" containerID="cri-o://7d97e71b68ada691a7cb95f4c3a11acacb9f90384df137f46bdba39f6c07e8e2" gracePeriod=30 Mar 14 09:44:24 crc kubenswrapper[4956]: I0314 09:44:24.162023 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 14 09:44:24 crc kubenswrapper[4956]: I0314 09:44:24.391275 4956 generic.go:334] "Generic (PLEG): container finished" podID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerID="7d97e71b68ada691a7cb95f4c3a11acacb9f90384df137f46bdba39f6c07e8e2" exitCode=0 Mar 14 09:44:24 crc kubenswrapper[4956]: I0314 09:44:24.391311 4956 generic.go:334] "Generic (PLEG): container finished" podID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerID="bf8f69e724e5dbb841366fdc3f4c1eeac5be8745a8132ea072121f152f815b57" exitCode=2 Mar 14 09:44:24 crc kubenswrapper[4956]: I0314 09:44:24.391348 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7dc3d756-2060-4570-a9e8-693e99a932b9","Type":"ContainerDied","Data":"7d97e71b68ada691a7cb95f4c3a11acacb9f90384df137f46bdba39f6c07e8e2"} Mar 14 09:44:24 crc kubenswrapper[4956]: I0314 09:44:24.391390 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7dc3d756-2060-4570-a9e8-693e99a932b9","Type":"ContainerDied","Data":"bf8f69e724e5dbb841366fdc3f4c1eeac5be8745a8132ea072121f152f815b57"} Mar 14 09:44:24 crc kubenswrapper[4956]: I0314 09:44:24.396193 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher6a49-account-delete-vlk68" event={"ID":"820c24eb-8503-4e7f-910e-98d300034b0e","Type":"ContainerDied","Data":"503143ac54c03d0002c871c597888f32067f3166c5bc3ec6d7773d72c68d2e99"} Mar 14 09:44:24 crc 
kubenswrapper[4956]: I0314 09:44:24.396242 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="503143ac54c03d0002c871c597888f32067f3166c5bc3ec6d7773d72c68d2e99" Mar 14 09:44:24 crc kubenswrapper[4956]: I0314 09:44:24.396216 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher6a49-account-delete-vlk68" Mar 14 09:44:25 crc kubenswrapper[4956]: I0314 09:44:25.219512 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2decb267-0305-4e3e-b505-cc929640d7a8" path="/var/lib/kubelet/pods/2decb267-0305-4e3e-b505-cc929640d7a8/volumes" Mar 14 09:44:25 crc kubenswrapper[4956]: I0314 09:44:25.220405 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92102f18-f0db-441b-a147-a8fab678a887" path="/var/lib/kubelet/pods/92102f18-f0db-441b-a147-a8fab678a887/volumes" Mar 14 09:44:25 crc kubenswrapper[4956]: I0314 09:44:25.409547 4956 generic.go:334] "Generic (PLEG): container finished" podID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerID="a944e4c4c37ad3c91848e25c44e642606e43d23133df417a265807b1f61d4868" exitCode=0 Mar 14 09:44:25 crc kubenswrapper[4956]: I0314 09:44:25.409596 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7dc3d756-2060-4570-a9e8-693e99a932b9","Type":"ContainerDied","Data":"a944e4c4c37ad3c91848e25c44e642606e43d23133df417a265807b1f61d4868"} Mar 14 09:44:25 crc kubenswrapper[4956]: I0314 09:44:25.768719 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-ngj29"] Mar 14 09:44:25 crc kubenswrapper[4956]: I0314 09:44:25.777649 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-ngj29"] Mar 14 09:44:25 crc kubenswrapper[4956]: I0314 09:44:25.793512 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-6a49-account-create-update-82rm9"] 
Mar 14 09:44:25 crc kubenswrapper[4956]: I0314 09:44:25.801695 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher6a49-account-delete-vlk68"] Mar 14 09:44:25 crc kubenswrapper[4956]: I0314 09:44:25.809814 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-6a49-account-create-update-82rm9"] Mar 14 09:44:25 crc kubenswrapper[4956]: I0314 09:44:25.816183 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher6a49-account-delete-vlk68"] Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.358215 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.418978 4956 generic.go:334] "Generic (PLEG): container finished" podID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerID="89f1ecb1dc3f16f01f4b35f4b32eefcd9e4473de8d47a25593825b2f0556b50c" exitCode=0 Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.419276 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7dc3d756-2060-4570-a9e8-693e99a932b9","Type":"ContainerDied","Data":"89f1ecb1dc3f16f01f4b35f4b32eefcd9e4473de8d47a25593825b2f0556b50c"} Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.419310 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7dc3d756-2060-4570-a9e8-693e99a932b9","Type":"ContainerDied","Data":"46a43b0a98a342203345f2fd7655551626a5ef60a737700df5aef9160bf8adb2"} Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.419330 4956 scope.go:117] "RemoveContainer" containerID="7d97e71b68ada691a7cb95f4c3a11acacb9f90384df137f46bdba39f6c07e8e2" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.419478 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.439817 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9k4r\" (UniqueName: \"kubernetes.io/projected/7dc3d756-2060-4570-a9e8-693e99a932b9-kube-api-access-s9k4r\") pod \"7dc3d756-2060-4570-a9e8-693e99a932b9\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.440139 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-scripts\") pod \"7dc3d756-2060-4570-a9e8-693e99a932b9\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.440288 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-ceilometer-tls-certs\") pod \"7dc3d756-2060-4570-a9e8-693e99a932b9\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.441009 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dc3d756-2060-4570-a9e8-693e99a932b9-log-httpd\") pod \"7dc3d756-2060-4570-a9e8-693e99a932b9\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.441107 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-combined-ca-bundle\") pod \"7dc3d756-2060-4570-a9e8-693e99a932b9\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.441181 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-config-data\") pod \"7dc3d756-2060-4570-a9e8-693e99a932b9\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.441247 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dc3d756-2060-4570-a9e8-693e99a932b9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7dc3d756-2060-4570-a9e8-693e99a932b9" (UID: "7dc3d756-2060-4570-a9e8-693e99a932b9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.441006 4956 scope.go:117] "RemoveContainer" containerID="bf8f69e724e5dbb841366fdc3f4c1eeac5be8745a8132ea072121f152f815b57" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.441357 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dc3d756-2060-4570-a9e8-693e99a932b9-run-httpd\") pod \"7dc3d756-2060-4570-a9e8-693e99a932b9\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.441456 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-sg-core-conf-yaml\") pod \"7dc3d756-2060-4570-a9e8-693e99a932b9\" (UID: \"7dc3d756-2060-4570-a9e8-693e99a932b9\") " Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.442145 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dc3d756-2060-4570-a9e8-693e99a932b9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.445189 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dc3d756-2060-4570-a9e8-693e99a932b9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod 
"7dc3d756-2060-4570-a9e8-693e99a932b9" (UID: "7dc3d756-2060-4570-a9e8-693e99a932b9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.445664 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dc3d756-2060-4570-a9e8-693e99a932b9-kube-api-access-s9k4r" (OuterVolumeSpecName: "kube-api-access-s9k4r") pod "7dc3d756-2060-4570-a9e8-693e99a932b9" (UID: "7dc3d756-2060-4570-a9e8-693e99a932b9"). InnerVolumeSpecName "kube-api-access-s9k4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.448595 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-scripts" (OuterVolumeSpecName: "scripts") pod "7dc3d756-2060-4570-a9e8-693e99a932b9" (UID: "7dc3d756-2060-4570-a9e8-693e99a932b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.468754 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7dc3d756-2060-4570-a9e8-693e99a932b9" (UID: "7dc3d756-2060-4570-a9e8-693e99a932b9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.488397 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7dc3d756-2060-4570-a9e8-693e99a932b9" (UID: "7dc3d756-2060-4570-a9e8-693e99a932b9"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.504456 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dc3d756-2060-4570-a9e8-693e99a932b9" (UID: "7dc3d756-2060-4570-a9e8-693e99a932b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.531861 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-config-data" (OuterVolumeSpecName: "config-data") pod "7dc3d756-2060-4570-a9e8-693e99a932b9" (UID: "7dc3d756-2060-4570-a9e8-693e99a932b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.535824 4956 scope.go:117] "RemoveContainer" containerID="89f1ecb1dc3f16f01f4b35f4b32eefcd9e4473de8d47a25593825b2f0556b50c" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.543173 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.543376 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.543463 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dc3d756-2060-4570-a9e8-693e99a932b9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.543593 4956 reconciler_common.go:293] "Volume 
detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.543673 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9k4r\" (UniqueName: \"kubernetes.io/projected/7dc3d756-2060-4570-a9e8-693e99a932b9-kube-api-access-s9k4r\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.543728 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.543779 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dc3d756-2060-4570-a9e8-693e99a932b9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.586503 4956 scope.go:117] "RemoveContainer" containerID="a944e4c4c37ad3c91848e25c44e642606e43d23133df417a265807b1f61d4868" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.603921 4956 scope.go:117] "RemoveContainer" containerID="7d97e71b68ada691a7cb95f4c3a11acacb9f90384df137f46bdba39f6c07e8e2" Mar 14 09:44:26 crc kubenswrapper[4956]: E0314 09:44:26.604185 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d97e71b68ada691a7cb95f4c3a11acacb9f90384df137f46bdba39f6c07e8e2\": container with ID starting with 7d97e71b68ada691a7cb95f4c3a11acacb9f90384df137f46bdba39f6c07e8e2 not found: ID does not exist" containerID="7d97e71b68ada691a7cb95f4c3a11acacb9f90384df137f46bdba39f6c07e8e2" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.604223 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7d97e71b68ada691a7cb95f4c3a11acacb9f90384df137f46bdba39f6c07e8e2"} err="failed to get container status \"7d97e71b68ada691a7cb95f4c3a11acacb9f90384df137f46bdba39f6c07e8e2\": rpc error: code = NotFound desc = could not find container \"7d97e71b68ada691a7cb95f4c3a11acacb9f90384df137f46bdba39f6c07e8e2\": container with ID starting with 7d97e71b68ada691a7cb95f4c3a11acacb9f90384df137f46bdba39f6c07e8e2 not found: ID does not exist" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.604246 4956 scope.go:117] "RemoveContainer" containerID="bf8f69e724e5dbb841366fdc3f4c1eeac5be8745a8132ea072121f152f815b57" Mar 14 09:44:26 crc kubenswrapper[4956]: E0314 09:44:26.604639 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf8f69e724e5dbb841366fdc3f4c1eeac5be8745a8132ea072121f152f815b57\": container with ID starting with bf8f69e724e5dbb841366fdc3f4c1eeac5be8745a8132ea072121f152f815b57 not found: ID does not exist" containerID="bf8f69e724e5dbb841366fdc3f4c1eeac5be8745a8132ea072121f152f815b57" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.604670 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf8f69e724e5dbb841366fdc3f4c1eeac5be8745a8132ea072121f152f815b57"} err="failed to get container status \"bf8f69e724e5dbb841366fdc3f4c1eeac5be8745a8132ea072121f152f815b57\": rpc error: code = NotFound desc = could not find container \"bf8f69e724e5dbb841366fdc3f4c1eeac5be8745a8132ea072121f152f815b57\": container with ID starting with bf8f69e724e5dbb841366fdc3f4c1eeac5be8745a8132ea072121f152f815b57 not found: ID does not exist" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.604696 4956 scope.go:117] "RemoveContainer" containerID="89f1ecb1dc3f16f01f4b35f4b32eefcd9e4473de8d47a25593825b2f0556b50c" Mar 14 09:44:26 crc kubenswrapper[4956]: E0314 09:44:26.604921 4956 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"89f1ecb1dc3f16f01f4b35f4b32eefcd9e4473de8d47a25593825b2f0556b50c\": container with ID starting with 89f1ecb1dc3f16f01f4b35f4b32eefcd9e4473de8d47a25593825b2f0556b50c not found: ID does not exist" containerID="89f1ecb1dc3f16f01f4b35f4b32eefcd9e4473de8d47a25593825b2f0556b50c" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.604942 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89f1ecb1dc3f16f01f4b35f4b32eefcd9e4473de8d47a25593825b2f0556b50c"} err="failed to get container status \"89f1ecb1dc3f16f01f4b35f4b32eefcd9e4473de8d47a25593825b2f0556b50c\": rpc error: code = NotFound desc = could not find container \"89f1ecb1dc3f16f01f4b35f4b32eefcd9e4473de8d47a25593825b2f0556b50c\": container with ID starting with 89f1ecb1dc3f16f01f4b35f4b32eefcd9e4473de8d47a25593825b2f0556b50c not found: ID does not exist" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.604955 4956 scope.go:117] "RemoveContainer" containerID="a944e4c4c37ad3c91848e25c44e642606e43d23133df417a265807b1f61d4868" Mar 14 09:44:26 crc kubenswrapper[4956]: E0314 09:44:26.605163 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a944e4c4c37ad3c91848e25c44e642606e43d23133df417a265807b1f61d4868\": container with ID starting with a944e4c4c37ad3c91848e25c44e642606e43d23133df417a265807b1f61d4868 not found: ID does not exist" containerID="a944e4c4c37ad3c91848e25c44e642606e43d23133df417a265807b1f61d4868" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.605181 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a944e4c4c37ad3c91848e25c44e642606e43d23133df417a265807b1f61d4868"} err="failed to get container status \"a944e4c4c37ad3c91848e25c44e642606e43d23133df417a265807b1f61d4868\": rpc error: code = NotFound desc = could not find container 
\"a944e4c4c37ad3c91848e25c44e642606e43d23133df417a265807b1f61d4868\": container with ID starting with a944e4c4c37ad3c91848e25c44e642606e43d23133df417a265807b1f61d4868 not found: ID does not exist" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.749246 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.755528 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.768347 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:44:26 crc kubenswrapper[4956]: E0314 09:44:26.768750 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerName="ceilometer-notification-agent" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.768775 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerName="ceilometer-notification-agent" Mar 14 09:44:26 crc kubenswrapper[4956]: E0314 09:44:26.768789 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerName="sg-core" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.768800 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerName="sg-core" Mar 14 09:44:26 crc kubenswrapper[4956]: E0314 09:44:26.768818 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92102f18-f0db-441b-a147-a8fab678a887" containerName="watcher-decision-engine" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.768827 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="92102f18-f0db-441b-a147-a8fab678a887" containerName="watcher-decision-engine" Mar 14 09:44:26 crc kubenswrapper[4956]: E0314 09:44:26.768847 4956 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="2decb267-0305-4e3e-b505-cc929640d7a8" containerName="watcher-applier" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.768855 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="2decb267-0305-4e3e-b505-cc929640d7a8" containerName="watcher-applier" Mar 14 09:44:26 crc kubenswrapper[4956]: E0314 09:44:26.768870 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820c24eb-8503-4e7f-910e-98d300034b0e" containerName="mariadb-account-delete" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.768879 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="820c24eb-8503-4e7f-910e-98d300034b0e" containerName="mariadb-account-delete" Mar 14 09:44:26 crc kubenswrapper[4956]: E0314 09:44:26.768894 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerName="proxy-httpd" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.768902 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerName="proxy-httpd" Mar 14 09:44:26 crc kubenswrapper[4956]: E0314 09:44:26.768916 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerName="ceilometer-central-agent" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.768923 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerName="ceilometer-central-agent" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.769110 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerName="proxy-httpd" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.769126 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="820c24eb-8503-4e7f-910e-98d300034b0e" containerName="mariadb-account-delete" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.769143 4956 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerName="sg-core" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.769157 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="92102f18-f0db-441b-a147-a8fab678a887" containerName="watcher-decision-engine" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.769169 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="2decb267-0305-4e3e-b505-cc929640d7a8" containerName="watcher-applier" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.769180 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerName="ceilometer-notification-agent" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.769194 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" containerName="ceilometer-central-agent" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.771041 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.773385 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.773660 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.773865 4956 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.800677 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.853764 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4eb2d1-4293-4a16-984e-f3aaae12b369-config-data\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.853929 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f4eb2d1-4293-4a16-984e-f3aaae12b369-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.854249 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f4eb2d1-4293-4a16-984e-f3aaae12b369-run-httpd\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.854424 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7twgv\" (UniqueName: \"kubernetes.io/projected/3f4eb2d1-4293-4a16-984e-f3aaae12b369-kube-api-access-7twgv\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.854472 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4eb2d1-4293-4a16-984e-f3aaae12b369-scripts\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.854520 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4eb2d1-4293-4a16-984e-f3aaae12b369-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.854615 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f4eb2d1-4293-4a16-984e-f3aaae12b369-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.854711 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f4eb2d1-4293-4a16-984e-f3aaae12b369-log-httpd\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.955953 4956 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f4eb2d1-4293-4a16-984e-f3aaae12b369-run-httpd\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.956051 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7twgv\" (UniqueName: \"kubernetes.io/projected/3f4eb2d1-4293-4a16-984e-f3aaae12b369-kube-api-access-7twgv\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.956080 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4eb2d1-4293-4a16-984e-f3aaae12b369-scripts\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.956103 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4eb2d1-4293-4a16-984e-f3aaae12b369-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.956141 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f4eb2d1-4293-4a16-984e-f3aaae12b369-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.956170 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f4eb2d1-4293-4a16-984e-f3aaae12b369-log-httpd\") pod \"ceilometer-0\" (UID: 
\"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.956199 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4eb2d1-4293-4a16-984e-f3aaae12b369-config-data\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.956232 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f4eb2d1-4293-4a16-984e-f3aaae12b369-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.956539 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f4eb2d1-4293-4a16-984e-f3aaae12b369-run-httpd\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.956874 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f4eb2d1-4293-4a16-984e-f3aaae12b369-log-httpd\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.969300 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f4eb2d1-4293-4a16-984e-f3aaae12b369-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.969352 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4eb2d1-4293-4a16-984e-f3aaae12b369-scripts\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.969358 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4eb2d1-4293-4a16-984e-f3aaae12b369-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.969427 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f4eb2d1-4293-4a16-984e-f3aaae12b369-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.970110 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4eb2d1-4293-4a16-984e-f3aaae12b369-config-data\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:26 crc kubenswrapper[4956]: I0314 09:44:26.977930 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7twgv\" (UniqueName: \"kubernetes.io/projected/3f4eb2d1-4293-4a16-984e-f3aaae12b369-kube-api-access-7twgv\") pod \"ceilometer-0\" (UID: \"3f4eb2d1-4293-4a16-984e-f3aaae12b369\") " pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:27 crc kubenswrapper[4956]: I0314 09:44:27.100115 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:27 crc kubenswrapper[4956]: I0314 09:44:27.242893 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d71f38a-1f62-4195-8b0f-33ea24cd04b4" path="/var/lib/kubelet/pods/1d71f38a-1f62-4195-8b0f-33ea24cd04b4/volumes" Mar 14 09:44:27 crc kubenswrapper[4956]: I0314 09:44:27.243828 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dc3d756-2060-4570-a9e8-693e99a932b9" path="/var/lib/kubelet/pods/7dc3d756-2060-4570-a9e8-693e99a932b9/volumes" Mar 14 09:44:27 crc kubenswrapper[4956]: I0314 09:44:27.244449 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="820c24eb-8503-4e7f-910e-98d300034b0e" path="/var/lib/kubelet/pods/820c24eb-8503-4e7f-910e-98d300034b0e/volumes" Mar 14 09:44:27 crc kubenswrapper[4956]: I0314 09:44:27.245415 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f685c4cb-19e7-4b54-a594-29001444b634" path="/var/lib/kubelet/pods/f685c4cb-19e7-4b54-a594-29001444b634/volumes" Mar 14 09:44:27 crc kubenswrapper[4956]: I0314 09:44:27.527440 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Mar 14 09:44:28 crc kubenswrapper[4956]: I0314 09:44:28.436523 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3f4eb2d1-4293-4a16-984e-f3aaae12b369","Type":"ContainerStarted","Data":"d5086c8d0fbda71bac0e1bff9d0d848679fd3113ba06af4778d816393a84bb14"} Mar 14 09:44:28 crc kubenswrapper[4956]: I0314 09:44:28.436859 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3f4eb2d1-4293-4a16-984e-f3aaae12b369","Type":"ContainerStarted","Data":"cd1acf7dbbbe8103ef185d49716b5050e62161cbef849a3a2ccb4465fe937e9b"} Mar 14 09:44:29 crc kubenswrapper[4956]: I0314 09:44:29.450245 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3f4eb2d1-4293-4a16-984e-f3aaae12b369","Type":"ContainerStarted","Data":"c69f0491fb033774959243d32f29835ff3ae1ca3a861253af8b24f20ddd2c2b3"} Mar 14 09:44:29 crc kubenswrapper[4956]: I0314 09:44:29.454051 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3f4eb2d1-4293-4a16-984e-f3aaae12b369","Type":"ContainerStarted","Data":"32fa8a67512eaf2be8e0ddcec5d1e44eb98a1ded2cbd0a70e96f5fa3147816b3"} Mar 14 09:44:31 crc kubenswrapper[4956]: I0314 09:44:31.472149 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3f4eb2d1-4293-4a16-984e-f3aaae12b369","Type":"ContainerStarted","Data":"ff61dee40290dde7b9a8f11fe57ddb0a65ce820652aa7c188db2cde65682f943"} Mar 14 09:44:31 crc kubenswrapper[4956]: I0314 09:44:31.474180 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:31 crc kubenswrapper[4956]: I0314 09:44:31.496508 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.352612057 podStartE2EDuration="5.496468201s" podCreationTimestamp="2026-03-14 09:44:26 +0000 UTC" firstStartedPulling="2026-03-14 09:44:27.531270218 +0000 UTC m=+2873.043962486" lastFinishedPulling="2026-03-14 09:44:30.675126362 +0000 UTC m=+2876.187818630" observedRunningTime="2026-03-14 09:44:31.492645574 +0000 UTC m=+2877.005337852" watchObservedRunningTime="2026-03-14 09:44:31.496468201 +0000 UTC m=+2877.009160469" Mar 14 09:44:43 crc kubenswrapper[4956]: I0314 09:44:43.245653 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mjvsv"] Mar 14 09:44:43 crc kubenswrapper[4956]: I0314 09:44:43.247766 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mjvsv" Mar 14 09:44:43 crc kubenswrapper[4956]: I0314 09:44:43.256353 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mjvsv"] Mar 14 09:44:43 crc kubenswrapper[4956]: I0314 09:44:43.298764 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-catalog-content\") pod \"certified-operators-mjvsv\" (UID: \"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066\") " pod="openshift-marketplace/certified-operators-mjvsv" Mar 14 09:44:43 crc kubenswrapper[4956]: I0314 09:44:43.298980 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftbph\" (UniqueName: \"kubernetes.io/projected/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-kube-api-access-ftbph\") pod \"certified-operators-mjvsv\" (UID: \"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066\") " pod="openshift-marketplace/certified-operators-mjvsv" Mar 14 09:44:43 crc kubenswrapper[4956]: I0314 09:44:43.299119 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-utilities\") pod \"certified-operators-mjvsv\" (UID: \"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066\") " pod="openshift-marketplace/certified-operators-mjvsv" Mar 14 09:44:43 crc kubenswrapper[4956]: I0314 09:44:43.400863 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-catalog-content\") pod \"certified-operators-mjvsv\" (UID: \"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066\") " pod="openshift-marketplace/certified-operators-mjvsv" Mar 14 09:44:43 crc kubenswrapper[4956]: I0314 09:44:43.400984 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ftbph\" (UniqueName: \"kubernetes.io/projected/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-kube-api-access-ftbph\") pod \"certified-operators-mjvsv\" (UID: \"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066\") " pod="openshift-marketplace/certified-operators-mjvsv" Mar 14 09:44:43 crc kubenswrapper[4956]: I0314 09:44:43.401059 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-utilities\") pod \"certified-operators-mjvsv\" (UID: \"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066\") " pod="openshift-marketplace/certified-operators-mjvsv" Mar 14 09:44:43 crc kubenswrapper[4956]: I0314 09:44:43.401477 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-catalog-content\") pod \"certified-operators-mjvsv\" (UID: \"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066\") " pod="openshift-marketplace/certified-operators-mjvsv" Mar 14 09:44:43 crc kubenswrapper[4956]: I0314 09:44:43.401622 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-utilities\") pod \"certified-operators-mjvsv\" (UID: \"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066\") " pod="openshift-marketplace/certified-operators-mjvsv" Mar 14 09:44:43 crc kubenswrapper[4956]: I0314 09:44:43.423100 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftbph\" (UniqueName: \"kubernetes.io/projected/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-kube-api-access-ftbph\") pod \"certified-operators-mjvsv\" (UID: \"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066\") " pod="openshift-marketplace/certified-operators-mjvsv" Mar 14 09:44:43 crc kubenswrapper[4956]: I0314 09:44:43.576815 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mjvsv" Mar 14 09:44:44 crc kubenswrapper[4956]: I0314 09:44:44.038009 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mjvsv"] Mar 14 09:44:44 crc kubenswrapper[4956]: I0314 09:44:44.574662 4956 generic.go:334] "Generic (PLEG): container finished" podID="fe3e9fa1-d1af-4889-8c1c-e1631c0a1066" containerID="8545cb665c39d68acc8e45e4fbf01bd046ed79fd90e7f1c619c3bdb6b6b1b4ff" exitCode=0 Mar 14 09:44:44 crc kubenswrapper[4956]: I0314 09:44:44.574709 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjvsv" event={"ID":"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066","Type":"ContainerDied","Data":"8545cb665c39d68acc8e45e4fbf01bd046ed79fd90e7f1c619c3bdb6b6b1b4ff"} Mar 14 09:44:44 crc kubenswrapper[4956]: I0314 09:44:44.575083 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjvsv" event={"ID":"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066","Type":"ContainerStarted","Data":"7079bb57a8d6b587daae0e3c18808e9575690fbfdf5f640cce7521fdd538a0e1"} Mar 14 09:44:45 crc kubenswrapper[4956]: I0314 09:44:45.586513 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjvsv" event={"ID":"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066","Type":"ContainerStarted","Data":"cc04393c5d6ec653b477c4b12dc5a4c03588593b0c4a54ab6d49f50879669547"} Mar 14 09:44:46 crc kubenswrapper[4956]: I0314 09:44:46.597336 4956 generic.go:334] "Generic (PLEG): container finished" podID="fe3e9fa1-d1af-4889-8c1c-e1631c0a1066" containerID="cc04393c5d6ec653b477c4b12dc5a4c03588593b0c4a54ab6d49f50879669547" exitCode=0 Mar 14 09:44:46 crc kubenswrapper[4956]: I0314 09:44:46.597400 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjvsv" 
event={"ID":"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066","Type":"ContainerDied","Data":"cc04393c5d6ec653b477c4b12dc5a4c03588593b0c4a54ab6d49f50879669547"} Mar 14 09:44:46 crc kubenswrapper[4956]: I0314 09:44:46.597825 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjvsv" event={"ID":"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066","Type":"ContainerStarted","Data":"3a145909b7cd492e5239f20724e50b1b8c0e70275506b8285d2574978184f5b4"} Mar 14 09:44:46 crc kubenswrapper[4956]: I0314 09:44:46.622394 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mjvsv" podStartSLOduration=2.240623227 podStartE2EDuration="3.622371272s" podCreationTimestamp="2026-03-14 09:44:43 +0000 UTC" firstStartedPulling="2026-03-14 09:44:44.580607322 +0000 UTC m=+2890.093299590" lastFinishedPulling="2026-03-14 09:44:45.962355367 +0000 UTC m=+2891.475047635" observedRunningTime="2026-03-14 09:44:46.616608376 +0000 UTC m=+2892.129300644" watchObservedRunningTime="2026-03-14 09:44:46.622371272 +0000 UTC m=+2892.135063540" Mar 14 09:44:53 crc kubenswrapper[4956]: I0314 09:44:53.577789 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mjvsv" Mar 14 09:44:53 crc kubenswrapper[4956]: I0314 09:44:53.578307 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mjvsv" Mar 14 09:44:53 crc kubenswrapper[4956]: I0314 09:44:53.624736 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mjvsv" Mar 14 09:44:53 crc kubenswrapper[4956]: I0314 09:44:53.693020 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mjvsv" Mar 14 09:44:55 crc kubenswrapper[4956]: I0314 09:44:55.423329 4956 patch_prober.go:28] interesting 
pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:44:55 crc kubenswrapper[4956]: I0314 09:44:55.424305 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:44:57 crc kubenswrapper[4956]: I0314 09:44:57.107373 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Mar 14 09:44:57 crc kubenswrapper[4956]: I0314 09:44:57.235506 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mjvsv"] Mar 14 09:44:57 crc kubenswrapper[4956]: I0314 09:44:57.235773 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mjvsv" podUID="fe3e9fa1-d1af-4889-8c1c-e1631c0a1066" containerName="registry-server" containerID="cri-o://3a145909b7cd492e5239f20724e50b1b8c0e70275506b8285d2574978184f5b4" gracePeriod=2 Mar 14 09:44:57 crc kubenswrapper[4956]: I0314 09:44:57.682784 4956 generic.go:334] "Generic (PLEG): container finished" podID="fe3e9fa1-d1af-4889-8c1c-e1631c0a1066" containerID="3a145909b7cd492e5239f20724e50b1b8c0e70275506b8285d2574978184f5b4" exitCode=0 Mar 14 09:44:57 crc kubenswrapper[4956]: I0314 09:44:57.682871 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjvsv" event={"ID":"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066","Type":"ContainerDied","Data":"3a145909b7cd492e5239f20724e50b1b8c0e70275506b8285d2574978184f5b4"} Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 
09:44:58.006226 4956 scope.go:117] "RemoveContainer" containerID="79d640ecd201324b505f6908b2335494de33703b8d9eb138bb4afdbe84e8f1bc" Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 09:44:58.190631 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mjvsv" Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 09:44:58.238877 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-utilities\") pod \"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066\" (UID: \"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066\") " Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 09:44:58.238951 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-catalog-content\") pod \"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066\" (UID: \"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066\") " Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 09:44:58.239053 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftbph\" (UniqueName: \"kubernetes.io/projected/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-kube-api-access-ftbph\") pod \"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066\" (UID: \"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066\") " Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 09:44:58.240691 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-utilities" (OuterVolumeSpecName: "utilities") pod "fe3e9fa1-d1af-4889-8c1c-e1631c0a1066" (UID: "fe3e9fa1-d1af-4889-8c1c-e1631c0a1066"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 09:44:58.244752 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-kube-api-access-ftbph" (OuterVolumeSpecName: "kube-api-access-ftbph") pod "fe3e9fa1-d1af-4889-8c1c-e1631c0a1066" (UID: "fe3e9fa1-d1af-4889-8c1c-e1631c0a1066"). InnerVolumeSpecName "kube-api-access-ftbph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 09:44:58.293858 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe3e9fa1-d1af-4889-8c1c-e1631c0a1066" (UID: "fe3e9fa1-d1af-4889-8c1c-e1631c0a1066"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 09:44:58.340644 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 09:44:58.340692 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 09:44:58.340706 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftbph\" (UniqueName: \"kubernetes.io/projected/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066-kube-api-access-ftbph\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 09:44:58.694866 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjvsv" 
event={"ID":"fe3e9fa1-d1af-4889-8c1c-e1631c0a1066","Type":"ContainerDied","Data":"7079bb57a8d6b587daae0e3c18808e9575690fbfdf5f640cce7521fdd538a0e1"} Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 09:44:58.694964 4956 scope.go:117] "RemoveContainer" containerID="3a145909b7cd492e5239f20724e50b1b8c0e70275506b8285d2574978184f5b4" Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 09:44:58.694956 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mjvsv" Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 09:44:58.717190 4956 scope.go:117] "RemoveContainer" containerID="cc04393c5d6ec653b477c4b12dc5a4c03588593b0c4a54ab6d49f50879669547" Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 09:44:58.739931 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mjvsv"] Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 09:44:58.752695 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mjvsv"] Mar 14 09:44:58 crc kubenswrapper[4956]: I0314 09:44:58.758972 4956 scope.go:117] "RemoveContainer" containerID="8545cb665c39d68acc8e45e4fbf01bd046ed79fd90e7f1c619c3bdb6b6b1b4ff" Mar 14 09:44:59 crc kubenswrapper[4956]: I0314 09:44:59.222726 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe3e9fa1-d1af-4889-8c1c-e1631c0a1066" path="/var/lib/kubelet/pods/fe3e9fa1-d1af-4889-8c1c-e1631c0a1066/volumes" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.038417 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nc79n/must-gather-lnld2"] Mar 14 09:45:00 crc kubenswrapper[4956]: E0314 09:45:00.038739 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe3e9fa1-d1af-4889-8c1c-e1631c0a1066" containerName="extract-content" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.038750 4956 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fe3e9fa1-d1af-4889-8c1c-e1631c0a1066" containerName="extract-content" Mar 14 09:45:00 crc kubenswrapper[4956]: E0314 09:45:00.038762 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe3e9fa1-d1af-4889-8c1c-e1631c0a1066" containerName="registry-server" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.038769 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe3e9fa1-d1af-4889-8c1c-e1631c0a1066" containerName="registry-server" Mar 14 09:45:00 crc kubenswrapper[4956]: E0314 09:45:00.038782 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe3e9fa1-d1af-4889-8c1c-e1631c0a1066" containerName="extract-utilities" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.038789 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe3e9fa1-d1af-4889-8c1c-e1631c0a1066" containerName="extract-utilities" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.038937 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe3e9fa1-d1af-4889-8c1c-e1631c0a1066" containerName="registry-server" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.039740 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nc79n/must-gather-lnld2" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.045843 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nc79n"/"openshift-service-ca.crt" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.047029 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nc79n"/"kube-root-ca.crt" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.071632 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2npsq\" (UniqueName: \"kubernetes.io/projected/7e68c8bc-ec3d-447e-ba56-dda97420dd80-kube-api-access-2npsq\") pod \"must-gather-lnld2\" (UID: \"7e68c8bc-ec3d-447e-ba56-dda97420dd80\") " pod="openshift-must-gather-nc79n/must-gather-lnld2" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.071982 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e68c8bc-ec3d-447e-ba56-dda97420dd80-must-gather-output\") pod \"must-gather-lnld2\" (UID: \"7e68c8bc-ec3d-447e-ba56-dda97420dd80\") " pod="openshift-must-gather-nc79n/must-gather-lnld2" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.122153 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nc79n/must-gather-lnld2"] Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.173323 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2npsq\" (UniqueName: \"kubernetes.io/projected/7e68c8bc-ec3d-447e-ba56-dda97420dd80-kube-api-access-2npsq\") pod \"must-gather-lnld2\" (UID: \"7e68c8bc-ec3d-447e-ba56-dda97420dd80\") " pod="openshift-must-gather-nc79n/must-gather-lnld2" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.173432 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e68c8bc-ec3d-447e-ba56-dda97420dd80-must-gather-output\") pod \"must-gather-lnld2\" (UID: \"7e68c8bc-ec3d-447e-ba56-dda97420dd80\") " pod="openshift-must-gather-nc79n/must-gather-lnld2" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.173899 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e68c8bc-ec3d-447e-ba56-dda97420dd80-must-gather-output\") pod \"must-gather-lnld2\" (UID: \"7e68c8bc-ec3d-447e-ba56-dda97420dd80\") " pod="openshift-must-gather-nc79n/must-gather-lnld2" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.179444 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss"] Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.181172 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.183642 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.183663 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.193594 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss"] Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.205901 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2npsq\" (UniqueName: \"kubernetes.io/projected/7e68c8bc-ec3d-447e-ba56-dda97420dd80-kube-api-access-2npsq\") pod \"must-gather-lnld2\" (UID: \"7e68c8bc-ec3d-447e-ba56-dda97420dd80\") " 
pod="openshift-must-gather-nc79n/must-gather-lnld2" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.274811 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7c84050-1abe-4cde-be35-a45eff902d68-config-volume\") pod \"collect-profiles-29558025-lsgss\" (UID: \"f7c84050-1abe-4cde-be35-a45eff902d68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.275033 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gz2r\" (UniqueName: \"kubernetes.io/projected/f7c84050-1abe-4cde-be35-a45eff902d68-kube-api-access-7gz2r\") pod \"collect-profiles-29558025-lsgss\" (UID: \"f7c84050-1abe-4cde-be35-a45eff902d68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.275388 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7c84050-1abe-4cde-be35-a45eff902d68-secret-volume\") pod \"collect-profiles-29558025-lsgss\" (UID: \"f7c84050-1abe-4cde-be35-a45eff902d68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.357570 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nc79n/must-gather-lnld2" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.376991 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gz2r\" (UniqueName: \"kubernetes.io/projected/f7c84050-1abe-4cde-be35-a45eff902d68-kube-api-access-7gz2r\") pod \"collect-profiles-29558025-lsgss\" (UID: \"f7c84050-1abe-4cde-be35-a45eff902d68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.377057 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7c84050-1abe-4cde-be35-a45eff902d68-secret-volume\") pod \"collect-profiles-29558025-lsgss\" (UID: \"f7c84050-1abe-4cde-be35-a45eff902d68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.377212 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7c84050-1abe-4cde-be35-a45eff902d68-config-volume\") pod \"collect-profiles-29558025-lsgss\" (UID: \"f7c84050-1abe-4cde-be35-a45eff902d68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.378391 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7c84050-1abe-4cde-be35-a45eff902d68-config-volume\") pod \"collect-profiles-29558025-lsgss\" (UID: \"f7c84050-1abe-4cde-be35-a45eff902d68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.388695 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f7c84050-1abe-4cde-be35-a45eff902d68-secret-volume\") pod \"collect-profiles-29558025-lsgss\" (UID: \"f7c84050-1abe-4cde-be35-a45eff902d68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.409233 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gz2r\" (UniqueName: \"kubernetes.io/projected/f7c84050-1abe-4cde-be35-a45eff902d68-kube-api-access-7gz2r\") pod \"collect-profiles-29558025-lsgss\" (UID: \"f7c84050-1abe-4cde-be35-a45eff902d68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss" Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.506794 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss" Mar 14 09:45:00 crc kubenswrapper[4956]: W0314 09:45:00.856509 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7c84050_1abe_4cde_be35_a45eff902d68.slice/crio-ea535d782c10b5eacf1dbdaabf8b273af06560156b985066e9f823957531a057 WatchSource:0}: Error finding container ea535d782c10b5eacf1dbdaabf8b273af06560156b985066e9f823957531a057: Status 404 returned error can't find the container with id ea535d782c10b5eacf1dbdaabf8b273af06560156b985066e9f823957531a057 Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.857329 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss"] Mar 14 09:45:00 crc kubenswrapper[4956]: I0314 09:45:00.885336 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nc79n/must-gather-lnld2"] Mar 14 09:45:00 crc kubenswrapper[4956]: W0314 09:45:00.892081 4956 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e68c8bc_ec3d_447e_ba56_dda97420dd80.slice/crio-e7e223a30c4d9415d97b7dcb346d3d343b75811defc034396ca21944ca14a015 WatchSource:0}: Error finding container e7e223a30c4d9415d97b7dcb346d3d343b75811defc034396ca21944ca14a015: Status 404 returned error can't find the container with id e7e223a30c4d9415d97b7dcb346d3d343b75811defc034396ca21944ca14a015 Mar 14 09:45:01 crc kubenswrapper[4956]: I0314 09:45:01.734452 4956 generic.go:334] "Generic (PLEG): container finished" podID="f7c84050-1abe-4cde-be35-a45eff902d68" containerID="26687c47812ed43c529fe46bbd33da4ad44421cc5d3b8d087b0361b4f4087000" exitCode=0 Mar 14 09:45:01 crc kubenswrapper[4956]: I0314 09:45:01.734621 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss" event={"ID":"f7c84050-1abe-4cde-be35-a45eff902d68","Type":"ContainerDied","Data":"26687c47812ed43c529fe46bbd33da4ad44421cc5d3b8d087b0361b4f4087000"} Mar 14 09:45:01 crc kubenswrapper[4956]: I0314 09:45:01.734760 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss" event={"ID":"f7c84050-1abe-4cde-be35-a45eff902d68","Type":"ContainerStarted","Data":"ea535d782c10b5eacf1dbdaabf8b273af06560156b985066e9f823957531a057"} Mar 14 09:45:01 crc kubenswrapper[4956]: I0314 09:45:01.738265 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nc79n/must-gather-lnld2" event={"ID":"7e68c8bc-ec3d-447e-ba56-dda97420dd80","Type":"ContainerStarted","Data":"e7e223a30c4d9415d97b7dcb346d3d343b75811defc034396ca21944ca14a015"} Mar 14 09:45:03 crc kubenswrapper[4956]: I0314 09:45:03.056687 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss" Mar 14 09:45:03 crc kubenswrapper[4956]: I0314 09:45:03.128507 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7c84050-1abe-4cde-be35-a45eff902d68-secret-volume\") pod \"f7c84050-1abe-4cde-be35-a45eff902d68\" (UID: \"f7c84050-1abe-4cde-be35-a45eff902d68\") " Mar 14 09:45:03 crc kubenswrapper[4956]: I0314 09:45:03.128644 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7c84050-1abe-4cde-be35-a45eff902d68-config-volume\") pod \"f7c84050-1abe-4cde-be35-a45eff902d68\" (UID: \"f7c84050-1abe-4cde-be35-a45eff902d68\") " Mar 14 09:45:03 crc kubenswrapper[4956]: I0314 09:45:03.128735 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gz2r\" (UniqueName: \"kubernetes.io/projected/f7c84050-1abe-4cde-be35-a45eff902d68-kube-api-access-7gz2r\") pod \"f7c84050-1abe-4cde-be35-a45eff902d68\" (UID: \"f7c84050-1abe-4cde-be35-a45eff902d68\") " Mar 14 09:45:03 crc kubenswrapper[4956]: I0314 09:45:03.129370 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c84050-1abe-4cde-be35-a45eff902d68-config-volume" (OuterVolumeSpecName: "config-volume") pod "f7c84050-1abe-4cde-be35-a45eff902d68" (UID: "f7c84050-1abe-4cde-be35-a45eff902d68"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:45:03 crc kubenswrapper[4956]: I0314 09:45:03.132852 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c84050-1abe-4cde-be35-a45eff902d68-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f7c84050-1abe-4cde-be35-a45eff902d68" (UID: "f7c84050-1abe-4cde-be35-a45eff902d68"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:45:03 crc kubenswrapper[4956]: I0314 09:45:03.133008 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c84050-1abe-4cde-be35-a45eff902d68-kube-api-access-7gz2r" (OuterVolumeSpecName: "kube-api-access-7gz2r") pod "f7c84050-1abe-4cde-be35-a45eff902d68" (UID: "f7c84050-1abe-4cde-be35-a45eff902d68"). InnerVolumeSpecName "kube-api-access-7gz2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:45:03 crc kubenswrapper[4956]: I0314 09:45:03.230444 4956 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7c84050-1abe-4cde-be35-a45eff902d68-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:45:03 crc kubenswrapper[4956]: I0314 09:45:03.230494 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gz2r\" (UniqueName: \"kubernetes.io/projected/f7c84050-1abe-4cde-be35-a45eff902d68-kube-api-access-7gz2r\") on node \"crc\" DevicePath \"\"" Mar 14 09:45:03 crc kubenswrapper[4956]: I0314 09:45:03.230538 4956 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7c84050-1abe-4cde-be35-a45eff902d68-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:45:03 crc kubenswrapper[4956]: I0314 09:45:03.777879 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss" event={"ID":"f7c84050-1abe-4cde-be35-a45eff902d68","Type":"ContainerDied","Data":"ea535d782c10b5eacf1dbdaabf8b273af06560156b985066e9f823957531a057"} Mar 14 09:45:03 crc kubenswrapper[4956]: I0314 09:45:03.777915 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-lsgss" Mar 14 09:45:03 crc kubenswrapper[4956]: I0314 09:45:03.777919 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea535d782c10b5eacf1dbdaabf8b273af06560156b985066e9f823957531a057" Mar 14 09:45:04 crc kubenswrapper[4956]: I0314 09:45:04.130316 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp"] Mar 14 09:45:04 crc kubenswrapper[4956]: I0314 09:45:04.137187 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557980-rlrhp"] Mar 14 09:45:05 crc kubenswrapper[4956]: I0314 09:45:05.230011 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35335b0c-87c6-40c0-9362-0252727eebee" path="/var/lib/kubelet/pods/35335b0c-87c6-40c0-9362-0252727eebee/volumes" Mar 14 09:45:07 crc kubenswrapper[4956]: I0314 09:45:07.812410 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nc79n/must-gather-lnld2" event={"ID":"7e68c8bc-ec3d-447e-ba56-dda97420dd80","Type":"ContainerStarted","Data":"28655a9a0e2fbe10e7aa8c194f7796c43f24b4d359a5cd672a7c1a8f2db09f70"} Mar 14 09:45:07 crc kubenswrapper[4956]: I0314 09:45:07.813644 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nc79n/must-gather-lnld2" event={"ID":"7e68c8bc-ec3d-447e-ba56-dda97420dd80","Type":"ContainerStarted","Data":"03fc8db1f82412c77d8985eb52e625b4148ef5617d5690f935be8b4ef51e51a9"} Mar 14 09:45:07 crc kubenswrapper[4956]: I0314 09:45:07.833695 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nc79n/must-gather-lnld2" podStartSLOduration=2.475147101 podStartE2EDuration="8.83367418s" podCreationTimestamp="2026-03-14 09:44:59 +0000 UTC" firstStartedPulling="2026-03-14 09:45:00.893877448 +0000 UTC m=+2906.406569716" 
lastFinishedPulling="2026-03-14 09:45:07.252404527 +0000 UTC m=+2912.765096795" observedRunningTime="2026-03-14 09:45:07.826855307 +0000 UTC m=+2913.339547585" watchObservedRunningTime="2026-03-14 09:45:07.83367418 +0000 UTC m=+2913.346366448" Mar 14 09:45:12 crc kubenswrapper[4956]: I0314 09:45:12.246406 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cbrsw"] Mar 14 09:45:12 crc kubenswrapper[4956]: E0314 09:45:12.247045 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c84050-1abe-4cde-be35-a45eff902d68" containerName="collect-profiles" Mar 14 09:45:12 crc kubenswrapper[4956]: I0314 09:45:12.247057 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c84050-1abe-4cde-be35-a45eff902d68" containerName="collect-profiles" Mar 14 09:45:12 crc kubenswrapper[4956]: I0314 09:45:12.247216 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c84050-1abe-4cde-be35-a45eff902d68" containerName="collect-profiles" Mar 14 09:45:12 crc kubenswrapper[4956]: I0314 09:45:12.248382 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cbrsw" Mar 14 09:45:12 crc kubenswrapper[4956]: I0314 09:45:12.260344 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cbrsw"] Mar 14 09:45:12 crc kubenswrapper[4956]: I0314 09:45:12.276043 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-262g8\" (UniqueName: \"kubernetes.io/projected/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-kube-api-access-262g8\") pod \"community-operators-cbrsw\" (UID: \"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0\") " pod="openshift-marketplace/community-operators-cbrsw" Mar 14 09:45:12 crc kubenswrapper[4956]: I0314 09:45:12.276157 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-utilities\") pod \"community-operators-cbrsw\" (UID: \"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0\") " pod="openshift-marketplace/community-operators-cbrsw" Mar 14 09:45:12 crc kubenswrapper[4956]: I0314 09:45:12.276220 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-catalog-content\") pod \"community-operators-cbrsw\" (UID: \"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0\") " pod="openshift-marketplace/community-operators-cbrsw" Mar 14 09:45:12 crc kubenswrapper[4956]: I0314 09:45:12.377931 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-262g8\" (UniqueName: \"kubernetes.io/projected/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-kube-api-access-262g8\") pod \"community-operators-cbrsw\" (UID: \"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0\") " pod="openshift-marketplace/community-operators-cbrsw" Mar 14 09:45:12 crc kubenswrapper[4956]: I0314 09:45:12.378002 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-utilities\") pod \"community-operators-cbrsw\" (UID: \"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0\") " pod="openshift-marketplace/community-operators-cbrsw" Mar 14 09:45:12 crc kubenswrapper[4956]: I0314 09:45:12.378045 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-catalog-content\") pod \"community-operators-cbrsw\" (UID: \"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0\") " pod="openshift-marketplace/community-operators-cbrsw" Mar 14 09:45:12 crc kubenswrapper[4956]: I0314 09:45:12.385880 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-catalog-content\") pod \"community-operators-cbrsw\" (UID: \"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0\") " pod="openshift-marketplace/community-operators-cbrsw" Mar 14 09:45:12 crc kubenswrapper[4956]: I0314 09:45:12.386018 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-utilities\") pod \"community-operators-cbrsw\" (UID: \"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0\") " pod="openshift-marketplace/community-operators-cbrsw" Mar 14 09:45:12 crc kubenswrapper[4956]: I0314 09:45:12.401744 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-262g8\" (UniqueName: \"kubernetes.io/projected/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-kube-api-access-262g8\") pod \"community-operators-cbrsw\" (UID: \"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0\") " pod="openshift-marketplace/community-operators-cbrsw" Mar 14 09:45:12 crc kubenswrapper[4956]: I0314 09:45:12.566908 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cbrsw" Mar 14 09:45:13 crc kubenswrapper[4956]: I0314 09:45:13.075663 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cbrsw"] Mar 14 09:45:13 crc kubenswrapper[4956]: I0314 09:45:13.867854 4956 generic.go:334] "Generic (PLEG): container finished" podID="5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0" containerID="023e40c148a7080b432e69dc9fdcde43ed3da73ca49449f2c70207f36af09653" exitCode=0 Mar 14 09:45:13 crc kubenswrapper[4956]: I0314 09:45:13.867952 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cbrsw" event={"ID":"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0","Type":"ContainerDied","Data":"023e40c148a7080b432e69dc9fdcde43ed3da73ca49449f2c70207f36af09653"} Mar 14 09:45:13 crc kubenswrapper[4956]: I0314 09:45:13.868134 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cbrsw" event={"ID":"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0","Type":"ContainerStarted","Data":"c20ac170245e244e29d9dcb9311099c5ad304290a563f3ce0ceb7de9b3248e96"} Mar 14 09:45:14 crc kubenswrapper[4956]: I0314 09:45:14.877709 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cbrsw" event={"ID":"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0","Type":"ContainerStarted","Data":"fc337d8b89617e762855ded4503f4edb71d7a923eee674c11c96d431eb849b2f"} Mar 14 09:45:15 crc kubenswrapper[4956]: E0314 09:45:15.129397 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cf5a8fe_bb96_4ca1_9c53_5085d5ffdee0.slice/crio-fc337d8b89617e762855ded4503f4edb71d7a923eee674c11c96d431eb849b2f.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cf5a8fe_bb96_4ca1_9c53_5085d5ffdee0.slice/crio-conmon-fc337d8b89617e762855ded4503f4edb71d7a923eee674c11c96d431eb849b2f.scope\": RecentStats: unable to find data in memory cache]" Mar 14 09:45:15 crc kubenswrapper[4956]: I0314 09:45:15.887586 4956 generic.go:334] "Generic (PLEG): container finished" podID="5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0" containerID="fc337d8b89617e762855ded4503f4edb71d7a923eee674c11c96d431eb849b2f" exitCode=0 Mar 14 09:45:15 crc kubenswrapper[4956]: I0314 09:45:15.887693 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cbrsw" event={"ID":"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0","Type":"ContainerDied","Data":"fc337d8b89617e762855ded4503f4edb71d7a923eee674c11c96d431eb849b2f"} Mar 14 09:45:16 crc kubenswrapper[4956]: I0314 09:45:16.899758 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cbrsw" event={"ID":"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0","Type":"ContainerStarted","Data":"f4616cd78e813a0153bedf943c8b4bc0decf11407b13d7e61da0cbfddad29f54"} Mar 14 09:45:16 crc kubenswrapper[4956]: I0314 09:45:16.918852 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cbrsw" podStartSLOduration=2.488861229 podStartE2EDuration="4.918831443s" podCreationTimestamp="2026-03-14 09:45:12 +0000 UTC" firstStartedPulling="2026-03-14 09:45:13.869789859 +0000 UTC m=+2919.382482127" lastFinishedPulling="2026-03-14 09:45:16.299760073 +0000 UTC m=+2921.812452341" observedRunningTime="2026-03-14 09:45:16.914840312 +0000 UTC m=+2922.427532580" watchObservedRunningTime="2026-03-14 09:45:16.918831443 +0000 UTC m=+2922.431523711" Mar 14 09:45:22 crc kubenswrapper[4956]: I0314 09:45:22.567416 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cbrsw" Mar 14 09:45:22 crc 
kubenswrapper[4956]: I0314 09:45:22.568241 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cbrsw" Mar 14 09:45:22 crc kubenswrapper[4956]: I0314 09:45:22.625678 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cbrsw" Mar 14 09:45:22 crc kubenswrapper[4956]: I0314 09:45:22.999827 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cbrsw" Mar 14 09:45:25 crc kubenswrapper[4956]: I0314 09:45:25.424081 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:45:25 crc kubenswrapper[4956]: I0314 09:45:25.424685 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:45:26 crc kubenswrapper[4956]: I0314 09:45:26.231074 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cbrsw"] Mar 14 09:45:26 crc kubenswrapper[4956]: I0314 09:45:26.231601 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cbrsw" podUID="5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0" containerName="registry-server" containerID="cri-o://f4616cd78e813a0153bedf943c8b4bc0decf11407b13d7e61da0cbfddad29f54" gracePeriod=2 Mar 14 09:45:26 crc kubenswrapper[4956]: I0314 09:45:26.599673 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cbrsw" Mar 14 09:45:26 crc kubenswrapper[4956]: I0314 09:45:26.789840 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-262g8\" (UniqueName: \"kubernetes.io/projected/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-kube-api-access-262g8\") pod \"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0\" (UID: \"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0\") " Mar 14 09:45:26 crc kubenswrapper[4956]: I0314 09:45:26.790311 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-utilities\") pod \"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0\" (UID: \"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0\") " Mar 14 09:45:26 crc kubenswrapper[4956]: I0314 09:45:26.790397 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-catalog-content\") pod \"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0\" (UID: \"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0\") " Mar 14 09:45:26 crc kubenswrapper[4956]: I0314 09:45:26.791176 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-utilities" (OuterVolumeSpecName: "utilities") pod "5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0" (UID: "5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:45:26 crc kubenswrapper[4956]: I0314 09:45:26.807892 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-kube-api-access-262g8" (OuterVolumeSpecName: "kube-api-access-262g8") pod "5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0" (UID: "5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0"). InnerVolumeSpecName "kube-api-access-262g8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:45:26 crc kubenswrapper[4956]: I0314 09:45:26.840115 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0" (UID: "5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:45:26 crc kubenswrapper[4956]: I0314 09:45:26.891871 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:45:26 crc kubenswrapper[4956]: I0314 09:45:26.891910 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:45:26 crc kubenswrapper[4956]: I0314 09:45:26.891925 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-262g8\" (UniqueName: \"kubernetes.io/projected/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0-kube-api-access-262g8\") on node \"crc\" DevicePath \"\"" Mar 14 09:45:26 crc kubenswrapper[4956]: I0314 09:45:26.980981 4956 generic.go:334] "Generic (PLEG): container finished" podID="5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0" containerID="f4616cd78e813a0153bedf943c8b4bc0decf11407b13d7e61da0cbfddad29f54" exitCode=0 Mar 14 09:45:26 crc kubenswrapper[4956]: I0314 09:45:26.981025 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cbrsw" event={"ID":"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0","Type":"ContainerDied","Data":"f4616cd78e813a0153bedf943c8b4bc0decf11407b13d7e61da0cbfddad29f54"} Mar 14 09:45:26 crc kubenswrapper[4956]: I0314 09:45:26.981050 4956 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-cbrsw" event={"ID":"5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0","Type":"ContainerDied","Data":"c20ac170245e244e29d9dcb9311099c5ad304290a563f3ce0ceb7de9b3248e96"} Mar 14 09:45:26 crc kubenswrapper[4956]: I0314 09:45:26.981069 4956 scope.go:117] "RemoveContainer" containerID="f4616cd78e813a0153bedf943c8b4bc0decf11407b13d7e61da0cbfddad29f54" Mar 14 09:45:26 crc kubenswrapper[4956]: I0314 09:45:26.981185 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cbrsw" Mar 14 09:45:27 crc kubenswrapper[4956]: I0314 09:45:27.005685 4956 scope.go:117] "RemoveContainer" containerID="fc337d8b89617e762855ded4503f4edb71d7a923eee674c11c96d431eb849b2f" Mar 14 09:45:27 crc kubenswrapper[4956]: I0314 09:45:27.009800 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cbrsw"] Mar 14 09:45:27 crc kubenswrapper[4956]: I0314 09:45:27.016699 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cbrsw"] Mar 14 09:45:27 crc kubenswrapper[4956]: I0314 09:45:27.028606 4956 scope.go:117] "RemoveContainer" containerID="023e40c148a7080b432e69dc9fdcde43ed3da73ca49449f2c70207f36af09653" Mar 14 09:45:27 crc kubenswrapper[4956]: I0314 09:45:27.057827 4956 scope.go:117] "RemoveContainer" containerID="f4616cd78e813a0153bedf943c8b4bc0decf11407b13d7e61da0cbfddad29f54" Mar 14 09:45:27 crc kubenswrapper[4956]: E0314 09:45:27.058360 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4616cd78e813a0153bedf943c8b4bc0decf11407b13d7e61da0cbfddad29f54\": container with ID starting with f4616cd78e813a0153bedf943c8b4bc0decf11407b13d7e61da0cbfddad29f54 not found: ID does not exist" containerID="f4616cd78e813a0153bedf943c8b4bc0decf11407b13d7e61da0cbfddad29f54" Mar 14 09:45:27 crc kubenswrapper[4956]: I0314 
09:45:27.058400 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4616cd78e813a0153bedf943c8b4bc0decf11407b13d7e61da0cbfddad29f54"} err="failed to get container status \"f4616cd78e813a0153bedf943c8b4bc0decf11407b13d7e61da0cbfddad29f54\": rpc error: code = NotFound desc = could not find container \"f4616cd78e813a0153bedf943c8b4bc0decf11407b13d7e61da0cbfddad29f54\": container with ID starting with f4616cd78e813a0153bedf943c8b4bc0decf11407b13d7e61da0cbfddad29f54 not found: ID does not exist" Mar 14 09:45:27 crc kubenswrapper[4956]: I0314 09:45:27.058426 4956 scope.go:117] "RemoveContainer" containerID="fc337d8b89617e762855ded4503f4edb71d7a923eee674c11c96d431eb849b2f" Mar 14 09:45:27 crc kubenswrapper[4956]: E0314 09:45:27.058944 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc337d8b89617e762855ded4503f4edb71d7a923eee674c11c96d431eb849b2f\": container with ID starting with fc337d8b89617e762855ded4503f4edb71d7a923eee674c11c96d431eb849b2f not found: ID does not exist" containerID="fc337d8b89617e762855ded4503f4edb71d7a923eee674c11c96d431eb849b2f" Mar 14 09:45:27 crc kubenswrapper[4956]: I0314 09:45:27.059180 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc337d8b89617e762855ded4503f4edb71d7a923eee674c11c96d431eb849b2f"} err="failed to get container status \"fc337d8b89617e762855ded4503f4edb71d7a923eee674c11c96d431eb849b2f\": rpc error: code = NotFound desc = could not find container \"fc337d8b89617e762855ded4503f4edb71d7a923eee674c11c96d431eb849b2f\": container with ID starting with fc337d8b89617e762855ded4503f4edb71d7a923eee674c11c96d431eb849b2f not found: ID does not exist" Mar 14 09:45:27 crc kubenswrapper[4956]: I0314 09:45:27.059353 4956 scope.go:117] "RemoveContainer" containerID="023e40c148a7080b432e69dc9fdcde43ed3da73ca49449f2c70207f36af09653" Mar 14 09:45:27 crc 
kubenswrapper[4956]: E0314 09:45:27.059863 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"023e40c148a7080b432e69dc9fdcde43ed3da73ca49449f2c70207f36af09653\": container with ID starting with 023e40c148a7080b432e69dc9fdcde43ed3da73ca49449f2c70207f36af09653 not found: ID does not exist" containerID="023e40c148a7080b432e69dc9fdcde43ed3da73ca49449f2c70207f36af09653" Mar 14 09:45:27 crc kubenswrapper[4956]: I0314 09:45:27.059882 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023e40c148a7080b432e69dc9fdcde43ed3da73ca49449f2c70207f36af09653"} err="failed to get container status \"023e40c148a7080b432e69dc9fdcde43ed3da73ca49449f2c70207f36af09653\": rpc error: code = NotFound desc = could not find container \"023e40c148a7080b432e69dc9fdcde43ed3da73ca49449f2c70207f36af09653\": container with ID starting with 023e40c148a7080b432e69dc9fdcde43ed3da73ca49449f2c70207f36af09653 not found: ID does not exist" Mar 14 09:45:27 crc kubenswrapper[4956]: I0314 09:45:27.219861 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0" path="/var/lib/kubelet/pods/5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0/volumes" Mar 14 09:45:55 crc kubenswrapper[4956]: I0314 09:45:55.424636 4956 patch_prober.go:28] interesting pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:45:55 crc kubenswrapper[4956]: I0314 09:45:55.425290 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 14 09:45:55 crc kubenswrapper[4956]: I0314 09:45:55.425351 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" Mar 14 09:45:55 crc kubenswrapper[4956]: I0314 09:45:55.426200 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634"} pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:45:55 crc kubenswrapper[4956]: I0314 09:45:55.426263 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" containerID="cri-o://41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" gracePeriod=600 Mar 14 09:45:55 crc kubenswrapper[4956]: E0314 09:45:55.547341 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:45:56 crc kubenswrapper[4956]: I0314 09:45:56.234460 4956 generic.go:334] "Generic (PLEG): container finished" podID="9ba20367-e506-422e-a846-eb1525cb3b94" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" exitCode=0 Mar 14 09:45:56 crc kubenswrapper[4956]: I0314 09:45:56.234779 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerDied","Data":"41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634"} Mar 14 09:45:56 crc kubenswrapper[4956]: I0314 09:45:56.234808 4956 scope.go:117] "RemoveContainer" containerID="782b831bfebf5d160529d876878bf15ad46c206265675567c1df2cedcbdb4339" Mar 14 09:45:56 crc kubenswrapper[4956]: I0314 09:45:56.235266 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:45:56 crc kubenswrapper[4956]: E0314 09:45:56.235460 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:45:58 crc kubenswrapper[4956]: I0314 09:45:58.192092 4956 scope.go:117] "RemoveContainer" containerID="a16980b04fdbf3bd3321ab84daf5d54fcd210c98b6326a7aa32ab71a67cfd38d" Mar 14 09:45:58 crc kubenswrapper[4956]: I0314 09:45:58.217680 4956 scope.go:117] "RemoveContainer" containerID="1e8530750c0ec3c127d0b7351bf4a3de4bd6e1db982aeb18acd933b9d4a2b560" Mar 14 09:45:58 crc kubenswrapper[4956]: I0314 09:45:58.251300 4956 scope.go:117] "RemoveContainer" containerID="b41f3f37b6aee35504110174cec3dcd32758f8a3ee7135728a5f899caa3cf3a9" Mar 14 09:45:58 crc kubenswrapper[4956]: I0314 09:45:58.292508 4956 scope.go:117] "RemoveContainer" containerID="e0840ba7ec5adfad0d3d833b4d5dacab1ecf9ac5b6f96c5448baeece09c17f9d" Mar 14 09:45:58 crc kubenswrapper[4956]: I0314 09:45:58.334709 4956 scope.go:117] "RemoveContainer" containerID="64b35f687efc8d5a793a1cb1bf645014be3d132c644f91aa6e9356e8c8b6ad25" Mar 14 09:45:58 crc 
kubenswrapper[4956]: I0314 09:45:58.379616 4956 scope.go:117] "RemoveContainer" containerID="f1e9183ff1f8807da241d845cb7718f58740fbdbfef022c5be5d726cad64f39d" Mar 14 09:45:58 crc kubenswrapper[4956]: I0314 09:45:58.399943 4956 scope.go:117] "RemoveContainer" containerID="8018ec2c9894974394dd592ff406206f4e5a3f8fa4d1c765f05346af4117cd40" Mar 14 09:45:58 crc kubenswrapper[4956]: I0314 09:45:58.419757 4956 scope.go:117] "RemoveContainer" containerID="45755657750e408ef6f018008d7c4072e4d556849bdb1af79c267402e4ff3a39" Mar 14 09:45:58 crc kubenswrapper[4956]: I0314 09:45:58.441084 4956 scope.go:117] "RemoveContainer" containerID="6f260e8f33b2163b744feb503a20b7d1d3d7dbf980ad0e5ed426cf64bc4dbac8" Mar 14 09:45:58 crc kubenswrapper[4956]: I0314 09:45:58.475965 4956 scope.go:117] "RemoveContainer" containerID="06c669332687ddd7bdfba4b48df4ca21c46d7b7a73dd332109daf5ff47a1c668" Mar 14 09:46:00 crc kubenswrapper[4956]: I0314 09:46:00.141533 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558026-9c6gt"] Mar 14 09:46:00 crc kubenswrapper[4956]: E0314 09:46:00.142300 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0" containerName="extract-utilities" Mar 14 09:46:00 crc kubenswrapper[4956]: I0314 09:46:00.142313 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0" containerName="extract-utilities" Mar 14 09:46:00 crc kubenswrapper[4956]: E0314 09:46:00.142324 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0" containerName="extract-content" Mar 14 09:46:00 crc kubenswrapper[4956]: I0314 09:46:00.142330 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0" containerName="extract-content" Mar 14 09:46:00 crc kubenswrapper[4956]: E0314 09:46:00.142345 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0" containerName="registry-server" Mar 14 09:46:00 crc kubenswrapper[4956]: I0314 09:46:00.142352 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0" containerName="registry-server" Mar 14 09:46:00 crc kubenswrapper[4956]: I0314 09:46:00.142564 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cf5a8fe-bb96-4ca1-9c53-5085d5ffdee0" containerName="registry-server" Mar 14 09:46:00 crc kubenswrapper[4956]: I0314 09:46:00.143180 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558026-9c6gt" Mar 14 09:46:00 crc kubenswrapper[4956]: I0314 09:46:00.148583 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:46:00 crc kubenswrapper[4956]: I0314 09:46:00.148743 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:46:00 crc kubenswrapper[4956]: I0314 09:46:00.151037 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558026-9c6gt"] Mar 14 09:46:00 crc kubenswrapper[4956]: I0314 09:46:00.151196 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:46:00 crc kubenswrapper[4956]: I0314 09:46:00.299823 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s697t\" (UniqueName: \"kubernetes.io/projected/864a9dee-9abe-458e-9fc7-151e1df7c41c-kube-api-access-s697t\") pod \"auto-csr-approver-29558026-9c6gt\" (UID: \"864a9dee-9abe-458e-9fc7-151e1df7c41c\") " pod="openshift-infra/auto-csr-approver-29558026-9c6gt" Mar 14 09:46:00 crc kubenswrapper[4956]: I0314 09:46:00.400855 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s697t\" (UniqueName: 
\"kubernetes.io/projected/864a9dee-9abe-458e-9fc7-151e1df7c41c-kube-api-access-s697t\") pod \"auto-csr-approver-29558026-9c6gt\" (UID: \"864a9dee-9abe-458e-9fc7-151e1df7c41c\") " pod="openshift-infra/auto-csr-approver-29558026-9c6gt" Mar 14 09:46:00 crc kubenswrapper[4956]: I0314 09:46:00.427454 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s697t\" (UniqueName: \"kubernetes.io/projected/864a9dee-9abe-458e-9fc7-151e1df7c41c-kube-api-access-s697t\") pod \"auto-csr-approver-29558026-9c6gt\" (UID: \"864a9dee-9abe-458e-9fc7-151e1df7c41c\") " pod="openshift-infra/auto-csr-approver-29558026-9c6gt" Mar 14 09:46:00 crc kubenswrapper[4956]: I0314 09:46:00.467134 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558026-9c6gt" Mar 14 09:46:00 crc kubenswrapper[4956]: I0314 09:46:00.904263 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558026-9c6gt"] Mar 14 09:46:01 crc kubenswrapper[4956]: I0314 09:46:01.315131 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558026-9c6gt" event={"ID":"864a9dee-9abe-458e-9fc7-151e1df7c41c","Type":"ContainerStarted","Data":"a67b30f79f9475c44a947f3e8003b006a8212135ec9c50d0039ab081af7f8a12"} Mar 14 09:46:02 crc kubenswrapper[4956]: I0314 09:46:02.326965 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558026-9c6gt" event={"ID":"864a9dee-9abe-458e-9fc7-151e1df7c41c","Type":"ContainerStarted","Data":"632f5f42be523f6a37ba490f4b0f04516ae5ca9deead2e599ce669aa8e969ca4"} Mar 14 09:46:02 crc kubenswrapper[4956]: I0314 09:46:02.356660 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558026-9c6gt" podStartSLOduration=1.393173207 podStartE2EDuration="2.356625953s" podCreationTimestamp="2026-03-14 09:46:00 +0000 UTC" 
firstStartedPulling="2026-03-14 09:46:00.917472657 +0000 UTC m=+2966.430164925" lastFinishedPulling="2026-03-14 09:46:01.880925413 +0000 UTC m=+2967.393617671" observedRunningTime="2026-03-14 09:46:02.343219714 +0000 UTC m=+2967.855911992" watchObservedRunningTime="2026-03-14 09:46:02.356625953 +0000 UTC m=+2967.869318231" Mar 14 09:46:03 crc kubenswrapper[4956]: I0314 09:46:03.338335 4956 generic.go:334] "Generic (PLEG): container finished" podID="864a9dee-9abe-458e-9fc7-151e1df7c41c" containerID="632f5f42be523f6a37ba490f4b0f04516ae5ca9deead2e599ce669aa8e969ca4" exitCode=0 Mar 14 09:46:03 crc kubenswrapper[4956]: I0314 09:46:03.338571 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558026-9c6gt" event={"ID":"864a9dee-9abe-458e-9fc7-151e1df7c41c","Type":"ContainerDied","Data":"632f5f42be523f6a37ba490f4b0f04516ae5ca9deead2e599ce669aa8e969ca4"} Mar 14 09:46:04 crc kubenswrapper[4956]: I0314 09:46:04.649924 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558026-9c6gt" Mar 14 09:46:04 crc kubenswrapper[4956]: I0314 09:46:04.776215 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s697t\" (UniqueName: \"kubernetes.io/projected/864a9dee-9abe-458e-9fc7-151e1df7c41c-kube-api-access-s697t\") pod \"864a9dee-9abe-458e-9fc7-151e1df7c41c\" (UID: \"864a9dee-9abe-458e-9fc7-151e1df7c41c\") " Mar 14 09:46:04 crc kubenswrapper[4956]: I0314 09:46:04.786838 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/864a9dee-9abe-458e-9fc7-151e1df7c41c-kube-api-access-s697t" (OuterVolumeSpecName: "kube-api-access-s697t") pod "864a9dee-9abe-458e-9fc7-151e1df7c41c" (UID: "864a9dee-9abe-458e-9fc7-151e1df7c41c"). InnerVolumeSpecName "kube-api-access-s697t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:46:04 crc kubenswrapper[4956]: I0314 09:46:04.878756 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s697t\" (UniqueName: \"kubernetes.io/projected/864a9dee-9abe-458e-9fc7-151e1df7c41c-kube-api-access-s697t\") on node \"crc\" DevicePath \"\"" Mar 14 09:46:05 crc kubenswrapper[4956]: I0314 09:46:05.357502 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558026-9c6gt" event={"ID":"864a9dee-9abe-458e-9fc7-151e1df7c41c","Type":"ContainerDied","Data":"a67b30f79f9475c44a947f3e8003b006a8212135ec9c50d0039ab081af7f8a12"} Mar 14 09:46:05 crc kubenswrapper[4956]: I0314 09:46:05.357582 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a67b30f79f9475c44a947f3e8003b006a8212135ec9c50d0039ab081af7f8a12" Mar 14 09:46:05 crc kubenswrapper[4956]: I0314 09:46:05.357605 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558026-9c6gt" Mar 14 09:46:05 crc kubenswrapper[4956]: I0314 09:46:05.432503 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558020-jbvd9"] Mar 14 09:46:05 crc kubenswrapper[4956]: I0314 09:46:05.443522 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558020-jbvd9"] Mar 14 09:46:07 crc kubenswrapper[4956]: I0314 09:46:07.223731 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5002f91d-89aa-4683-bd3f-118a91b6459c" path="/var/lib/kubelet/pods/5002f91d-89aa-4683-bd3f-118a91b6459c/volumes" Mar 14 09:46:08 crc kubenswrapper[4956]: I0314 09:46:08.209822 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:46:08 crc kubenswrapper[4956]: E0314 09:46:08.210398 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:46:12 crc kubenswrapper[4956]: I0314 09:46:12.329027 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb_aec90a2d-58f3-4dee-90c3-60a8fc90bc56/util/0.log" Mar 14 09:46:12 crc kubenswrapper[4956]: I0314 09:46:12.725508 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb_aec90a2d-58f3-4dee-90c3-60a8fc90bc56/util/0.log" Mar 14 09:46:12 crc kubenswrapper[4956]: I0314 09:46:12.726701 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb_aec90a2d-58f3-4dee-90c3-60a8fc90bc56/pull/0.log" Mar 14 09:46:12 crc kubenswrapper[4956]: I0314 09:46:12.773031 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb_aec90a2d-58f3-4dee-90c3-60a8fc90bc56/pull/0.log" Mar 14 09:46:13 crc kubenswrapper[4956]: I0314 09:46:13.008260 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb_aec90a2d-58f3-4dee-90c3-60a8fc90bc56/util/0.log" Mar 14 09:46:13 crc kubenswrapper[4956]: I0314 09:46:13.048753 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb_aec90a2d-58f3-4dee-90c3-60a8fc90bc56/extract/0.log" Mar 14 09:46:13 crc kubenswrapper[4956]: I0314 09:46:13.058499 4956 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_62d0bed58019ccc1d3626e895c0dbede7faaf3676e025b9f97c0e0a616vmnnb_aec90a2d-58f3-4dee-90c3-60a8fc90bc56/pull/0.log" Mar 14 09:46:13 crc kubenswrapper[4956]: I0314 09:46:13.251469 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr_07597b31-3cd5-4c5c-8bb1-65366038ddbb/util/0.log" Mar 14 09:46:13 crc kubenswrapper[4956]: I0314 09:46:13.409529 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr_07597b31-3cd5-4c5c-8bb1-65366038ddbb/pull/0.log" Mar 14 09:46:13 crc kubenswrapper[4956]: I0314 09:46:13.432711 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr_07597b31-3cd5-4c5c-8bb1-65366038ddbb/util/0.log" Mar 14 09:46:13 crc kubenswrapper[4956]: I0314 09:46:13.448963 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr_07597b31-3cd5-4c5c-8bb1-65366038ddbb/pull/0.log" Mar 14 09:46:13 crc kubenswrapper[4956]: I0314 09:46:13.607902 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr_07597b31-3cd5-4c5c-8bb1-65366038ddbb/pull/0.log" Mar 14 09:46:13 crc kubenswrapper[4956]: I0314 09:46:13.622058 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr_07597b31-3cd5-4c5c-8bb1-65366038ddbb/extract/0.log" Mar 14 09:46:13 crc kubenswrapper[4956]: I0314 09:46:13.635824 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_632ec92be58b5de1327ed8930598ec28f26f424a3aad5e3b3c0e97c82cpbhtr_07597b31-3cd5-4c5c-8bb1-65366038ddbb/util/0.log" Mar 14 09:46:14 crc kubenswrapper[4956]: I0314 09:46:14.320513 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9c8c85cd7-8xwbp_6d1aed1b-6436-46ca-a824-59eafb8ca5d3/manager/0.log" Mar 14 09:46:14 crc kubenswrapper[4956]: I0314 09:46:14.509622 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-74d565fbd5-c924b_18021394-f27d-422e-a68c-24a19d74ceb8/manager/0.log" Mar 14 09:46:14 crc kubenswrapper[4956]: I0314 09:46:14.727299 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d6bd468b-db2v8_876c14aa-a86a-495f-a110-3ade7d8d69fb/manager/0.log" Mar 14 09:46:14 crc kubenswrapper[4956]: I0314 09:46:14.977083 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9475cdd7-586s5_4492ee98-efe4-49c3-8c14-86453a8e8714/manager/0.log" Mar 14 09:46:15 crc kubenswrapper[4956]: I0314 09:46:15.444202 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-cb6d66846-b5jzw_9bf45de7-ba46-4ce9-a7d7-fc26e253423b/manager/0.log" Mar 14 09:46:15 crc kubenswrapper[4956]: I0314 09:46:15.512207 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-fbfb5bd65-v7cqm_e43b6f19-e463-43ce-9efe-5cefa3b53682/manager/0.log" Mar 14 09:46:15 crc kubenswrapper[4956]: I0314 09:46:15.642809 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-bf6b7fd8c-56k2s_401a3c6d-db2a-435b-b7f5-08816736d895/manager/0.log" Mar 14 09:46:15 crc kubenswrapper[4956]: I0314 09:46:15.938382 4956 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-68f8d496f8-4knlv_18ad86e0-d070-4de1-bd50-a93f9abdf715/manager/0.log" Mar 14 09:46:16 crc kubenswrapper[4956]: I0314 09:46:16.037525 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6f6f57b9b6-9c7lm_19ee7ede-7bda-46bc-8413-95262fa53969/manager/0.log" Mar 14 09:46:16 crc kubenswrapper[4956]: I0314 09:46:16.370374 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-744456f686-nnfnk_03bd3900-f5fa-476e-a91a-f492e4a424dc/manager/0.log" Mar 14 09:46:16 crc kubenswrapper[4956]: I0314 09:46:16.447899 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-645c9f6488-9bn7h_240f1b23-5499-4644-a442-9647e71a33d4/manager/0.log" Mar 14 09:46:16 crc kubenswrapper[4956]: I0314 09:46:16.685002 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-58ff56fcc7-9kjv4_fdc31f77-84df-4657-b5ec-a7fcd8b673e2/manager/0.log" Mar 14 09:46:17 crc kubenswrapper[4956]: I0314 09:46:17.057832 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7cf9f49d6-qwdtd_e0b8fc0f-4ffa-4c26-84e4-9613f5161286/manager/0.log" Mar 14 09:46:17 crc kubenswrapper[4956]: I0314 09:46:17.090973 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-xbdph_67182d33-3abc-4661-8614-94238efc9e45/manager/0.log" Mar 14 09:46:17 crc kubenswrapper[4956]: I0314 09:46:17.368670 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-kgjkt_9ac17263-727b-4bdd-8217-920844d59367/registry-server/0.log" Mar 14 09:46:17 crc kubenswrapper[4956]: I0314 09:46:17.567394 4956 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-848d74f969-6n97h_a8d7db47-dd9b-4785-8e80-d7d97a324225/manager/0.log" Mar 14 09:46:17 crc kubenswrapper[4956]: I0314 09:46:17.735787 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-b5c469fd-bffxq_bad25fda-3055-4fa2-8fd4-24980a88c7c6/manager/0.log" Mar 14 09:46:17 crc kubenswrapper[4956]: I0314 09:46:17.919635 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-59b5586c67-c9k5q_6bff6422-f245-457e-9ddb-28f957c9edac/manager/0.log" Mar 14 09:46:17 crc kubenswrapper[4956]: I0314 09:46:17.931037 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-vppb7_3d61bf91-4992-47ad-8a53-e823a71d3f9c/operator/0.log" Mar 14 09:46:18 crc kubenswrapper[4956]: I0314 09:46:18.092184 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-7f7469dbc6-6djhw_24e7f369-d268-4aa7-89f9-1b4ce48fd197/manager/0.log" Mar 14 09:46:18 crc kubenswrapper[4956]: I0314 09:46:18.472788 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-r9dtz_f1b524ae-60fa-48a7-aa07-bf354bd2ff62/manager/0.log" Mar 14 09:46:18 crc kubenswrapper[4956]: I0314 09:46:18.513673 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6646df7cdb-tspwx_2c1f44c6-aae3-4c3c-933e-c87956fb0fe6/manager/0.log" Mar 14 09:46:18 crc kubenswrapper[4956]: I0314 09:46:18.790745 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-index-jl5p2_75e6f1a9-bb4c-496d-841c-849bf8a375d7/registry-server/0.log" Mar 14 09:46:19 crc kubenswrapper[4956]: I0314 09:46:19.001430 4956 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-788cc4b948-xgdnd_75ec3cea-39db-4cc8-8065-17259b7dd1e4/manager/0.log" Mar 14 09:46:20 crc kubenswrapper[4956]: I0314 09:46:20.365916 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64768694d-m9dwm_b3749fc9-e22e-42b9-8865-68679f7d78f1/manager/0.log" Mar 14 09:46:21 crc kubenswrapper[4956]: I0314 09:46:21.209209 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:46:21 crc kubenswrapper[4956]: E0314 09:46:21.209549 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:46:33 crc kubenswrapper[4956]: I0314 09:46:33.209668 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:46:33 crc kubenswrapper[4956]: E0314 09:46:33.211191 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:46:38 crc kubenswrapper[4956]: I0314 09:46:38.604092 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-d7wqv_1666768a-cdbb-4ab1-83d8-b1ad0444f167/control-plane-machine-set-operator/0.log" Mar 14 09:46:38 crc kubenswrapper[4956]: I0314 09:46:38.789724 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8gwm6_095068b1-bf13-43a2-a250-a0eaeb60c6ae/kube-rbac-proxy/0.log" Mar 14 09:46:38 crc kubenswrapper[4956]: I0314 09:46:38.858050 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8gwm6_095068b1-bf13-43a2-a250-a0eaeb60c6ae/machine-api-operator/0.log" Mar 14 09:46:44 crc kubenswrapper[4956]: I0314 09:46:44.209129 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:46:44 crc kubenswrapper[4956]: E0314 09:46:44.209926 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:46:51 crc kubenswrapper[4956]: I0314 09:46:51.084036 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-jmg2k_04227b43-3672-4848-84f8-275b2ec997d8/cert-manager-controller/0.log" Mar 14 09:46:51 crc kubenswrapper[4956]: I0314 09:46:51.280553 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-lmlng_2ab2a4c0-0a05-4ffa-9900-9af5dbe961df/cert-manager-cainjector/0.log" Mar 14 09:46:51 crc kubenswrapper[4956]: I0314 09:46:51.385315 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-hfb75_470dcdf7-6d52-4164-bcfe-c8922d387147/cert-manager-webhook/0.log" Mar 14 09:46:58 crc kubenswrapper[4956]: I0314 09:46:58.722535 4956 scope.go:117] "RemoveContainer" containerID="a1623505805caebc13b8d8fcee88b6e3251419e9e8f0f6b76d5704fab9553286" Mar 14 09:46:58 crc kubenswrapper[4956]: I0314 09:46:58.741453 4956 scope.go:117] "RemoveContainer" containerID="7b6468e56288278161e95b0db81979778da89c96a845c69ceac10eb420ef803f" Mar 14 09:46:58 crc kubenswrapper[4956]: I0314 09:46:58.808384 4956 scope.go:117] "RemoveContainer" containerID="10471dc0fc95c2ba4feb5a8f549497f00af5e7ff0dc32e67305663bce8757d5e" Mar 14 09:46:58 crc kubenswrapper[4956]: I0314 09:46:58.827871 4956 scope.go:117] "RemoveContainer" containerID="f7937fa0cab416169b6f01ff83e48f22f4d8b69c8b3932bdf4a026c6fb49ebf4" Mar 14 09:46:58 crc kubenswrapper[4956]: I0314 09:46:58.860844 4956 scope.go:117] "RemoveContainer" containerID="eab81e6bc17e803ff5dad9fae895f899217da9b8d2374071891cc93dd312a247" Mar 14 09:46:59 crc kubenswrapper[4956]: I0314 09:46:59.209703 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:46:59 crc kubenswrapper[4956]: E0314 09:46:59.209923 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:47:03 crc kubenswrapper[4956]: I0314 09:47:03.637758 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-6w69h_986e9786-a2ab-4b67-8b7c-923951ef0928/nmstate-console-plugin/0.log" Mar 14 09:47:03 crc kubenswrapper[4956]: 
I0314 09:47:03.868200 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jz8xl_b0216839-cf58-4b58-8156-531abde2cbd1/nmstate-handler/0.log" Mar 14 09:47:03 crc kubenswrapper[4956]: I0314 09:47:03.950062 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wggcs_6c93f252-354d-484f-bc01-705766542dd9/kube-rbac-proxy/0.log" Mar 14 09:47:04 crc kubenswrapper[4956]: I0314 09:47:04.049106 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wggcs_6c93f252-354d-484f-bc01-705766542dd9/nmstate-metrics/0.log" Mar 14 09:47:04 crc kubenswrapper[4956]: I0314 09:47:04.117214 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-scdwq_01314a74-350e-49d3-a090-a924ac031589/nmstate-operator/0.log" Mar 14 09:47:04 crc kubenswrapper[4956]: I0314 09:47:04.260733 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-wdzfb_e5ccd04c-58e5-49c2-9667-8cdaa37dd381/nmstate-webhook/0.log" Mar 14 09:47:12 crc kubenswrapper[4956]: I0314 09:47:12.209259 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:47:12 crc kubenswrapper[4956]: E0314 09:47:12.210086 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:47:17 crc kubenswrapper[4956]: I0314 09:47:17.914669 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-259ww_c9a78fe1-8b1a-4655-ad98-19c53622c2b1/prometheus-operator/0.log" Mar 14 09:47:18 crc kubenswrapper[4956]: I0314 09:47:18.057529 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw_c888df81-af56-4ddd-8857-4952f199f288/prometheus-operator-admission-webhook/0.log" Mar 14 09:47:18 crc kubenswrapper[4956]: I0314 09:47:18.128152 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z_b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193/prometheus-operator-admission-webhook/0.log" Mar 14 09:47:18 crc kubenswrapper[4956]: I0314 09:47:18.266069 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-q6fzl_2dd43b3c-f9be-41a8-b1a6-11ab052283c7/operator/0.log" Mar 14 09:47:18 crc kubenswrapper[4956]: I0314 09:47:18.317641 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-hkwjk_0be4e37c-1708-4556-9ac0-e6daaf8fdadf/observability-ui-dashboards/0.log" Mar 14 09:47:18 crc kubenswrapper[4956]: I0314 09:47:18.446223 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-2vqmj_50a4be30-e002-4c15-b3ef-b3048665261b/perses-operator/0.log" Mar 14 09:47:25 crc kubenswrapper[4956]: I0314 09:47:25.216398 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:47:25 crc kubenswrapper[4956]: E0314 09:47:25.217123 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:47:31 crc kubenswrapper[4956]: I0314 09:47:31.714875 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-pmz25_da0e2705-3765-4603-97ec-4ff1f5a2bf73/kube-rbac-proxy/0.log" Mar 14 09:47:31 crc kubenswrapper[4956]: I0314 09:47:31.834927 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-pmz25_da0e2705-3765-4603-97ec-4ff1f5a2bf73/controller/0.log" Mar 14 09:47:31 crc kubenswrapper[4956]: I0314 09:47:31.975762 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/cp-frr-files/0.log" Mar 14 09:47:32 crc kubenswrapper[4956]: I0314 09:47:32.203433 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/cp-reloader/0.log" Mar 14 09:47:32 crc kubenswrapper[4956]: I0314 09:47:32.212232 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/cp-frr-files/0.log" Mar 14 09:47:32 crc kubenswrapper[4956]: I0314 09:47:32.248025 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/cp-metrics/0.log" Mar 14 09:47:32 crc kubenswrapper[4956]: I0314 09:47:32.275491 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/cp-reloader/0.log" Mar 14 09:47:32 crc kubenswrapper[4956]: I0314 09:47:32.662233 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/cp-metrics/0.log" Mar 14 09:47:32 crc kubenswrapper[4956]: I0314 
09:47:32.671240 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/cp-reloader/0.log" Mar 14 09:47:32 crc kubenswrapper[4956]: I0314 09:47:32.742594 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/cp-metrics/0.log" Mar 14 09:47:32 crc kubenswrapper[4956]: I0314 09:47:32.742773 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/cp-frr-files/0.log" Mar 14 09:47:32 crc kubenswrapper[4956]: I0314 09:47:32.897958 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/cp-metrics/0.log" Mar 14 09:47:32 crc kubenswrapper[4956]: I0314 09:47:32.900453 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/cp-reloader/0.log" Mar 14 09:47:32 crc kubenswrapper[4956]: I0314 09:47:32.928120 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/controller/0.log" Mar 14 09:47:32 crc kubenswrapper[4956]: I0314 09:47:32.978798 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/cp-frr-files/0.log" Mar 14 09:47:33 crc kubenswrapper[4956]: I0314 09:47:33.060440 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/frr-metrics/0.log" Mar 14 09:47:33 crc kubenswrapper[4956]: I0314 09:47:33.136444 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/kube-rbac-proxy/0.log" Mar 14 09:47:33 crc kubenswrapper[4956]: I0314 09:47:33.206452 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/kube-rbac-proxy-frr/0.log" Mar 14 09:47:33 crc kubenswrapper[4956]: I0314 09:47:33.299822 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/reloader/0.log" Mar 14 09:47:33 crc kubenswrapper[4956]: I0314 09:47:33.455162 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-nd87q_1f771f81-34d4-4b3a-b2f9-1791c32f81fa/frr-k8s-webhook-server/0.log" Mar 14 09:47:33 crc kubenswrapper[4956]: I0314 09:47:33.704834 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6ccbfd57fd-wcxtt_955961ad-ed3c-4fca-9da0-d9f44684fee8/manager/0.log" Mar 14 09:47:33 crc kubenswrapper[4956]: I0314 09:47:33.836773 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-886f6977-6gxlb_6ead3e16-6e7d-4598-993b-17d3fce555f3/webhook-server/0.log" Mar 14 09:47:33 crc kubenswrapper[4956]: I0314 09:47:33.990266 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-npbfj_ce1b2f0a-e73a-4105-ba00-f91d243f6fd9/kube-rbac-proxy/0.log" Mar 14 09:47:34 crc kubenswrapper[4956]: I0314 09:47:34.312998 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-npbfj_ce1b2f0a-e73a-4105-ba00-f91d243f6fd9/speaker/0.log" Mar 14 09:47:34 crc kubenswrapper[4956]: I0314 09:47:34.593311 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-s58l8_a9d18aeb-d0f0-4312-945e-2eae3025fd59/frr/0.log" Mar 14 09:47:37 crc kubenswrapper[4956]: I0314 09:47:37.209829 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:47:37 crc kubenswrapper[4956]: E0314 09:47:37.210315 4956 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:47:43 crc kubenswrapper[4956]: I0314 09:47:43.034155 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-tttxv"] Mar 14 09:47:43 crc kubenswrapper[4956]: I0314 09:47:43.039906 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-tttxv"] Mar 14 09:47:43 crc kubenswrapper[4956]: I0314 09:47:43.221967 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3721c946-1a7b-4a9d-8fb8-1455cfa063c8" path="/var/lib/kubelet/pods/3721c946-1a7b-4a9d-8fb8-1455cfa063c8/volumes" Mar 14 09:47:52 crc kubenswrapper[4956]: I0314 09:47:52.209907 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:47:52 crc kubenswrapper[4956]: E0314 09:47:52.210663 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:47:56 crc kubenswrapper[4956]: I0314 09:47:56.977640 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531/init-config-reloader/0.log" Mar 14 09:47:57 crc kubenswrapper[4956]: I0314 09:47:57.197760 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531/alertmanager/0.log" Mar 14 09:47:57 crc kubenswrapper[4956]: I0314 09:47:57.254741 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531/init-config-reloader/0.log" Mar 14 09:47:57 crc kubenswrapper[4956]: I0314 09:47:57.505100 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_3f4eb2d1-4293-4a16-984e-f3aaae12b369/ceilometer-central-agent/0.log" Mar 14 09:47:57 crc kubenswrapper[4956]: I0314 09:47:57.592664 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_f4bd2bca-f8c8-4cf0-a8b3-0bc6e5951531/config-reloader/0.log" Mar 14 09:47:57 crc kubenswrapper[4956]: I0314 09:47:57.670574 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_3f4eb2d1-4293-4a16-984e-f3aaae12b369/ceilometer-notification-agent/0.log" Mar 14 09:47:57 crc kubenswrapper[4956]: I0314 09:47:57.747164 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_3f4eb2d1-4293-4a16-984e-f3aaae12b369/proxy-httpd/0.log" Mar 14 09:47:57 crc kubenswrapper[4956]: I0314 09:47:57.840036 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_3f4eb2d1-4293-4a16-984e-f3aaae12b369/sg-core/0.log" Mar 14 09:47:57 crc kubenswrapper[4956]: I0314 09:47:57.943601 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_keystone-75c5988f99-28ht4_67a35faf-96ec-45eb-af07-2a248bca71e2/keystone-api/0.log" Mar 14 09:47:58 crc kubenswrapper[4956]: I0314 09:47:58.050695 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_kube-state-metrics-0_0e00a078-7836-4348-b513-0c8af77d837d/kube-state-metrics/0.log" Mar 14 09:47:58 crc 
kubenswrapper[4956]: I0314 09:47:58.293158 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_19027c02-7e5c-484e-ba9f-b9f9e7c4f81e/mysql-bootstrap/0.log" Mar 14 09:47:58 crc kubenswrapper[4956]: I0314 09:47:58.503381 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_19027c02-7e5c-484e-ba9f-b9f9e7c4f81e/galera/0.log" Mar 14 09:47:58 crc kubenswrapper[4956]: I0314 09:47:58.571262 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_19027c02-7e5c-484e-ba9f-b9f9e7c4f81e/mysql-bootstrap/0.log" Mar 14 09:47:58 crc kubenswrapper[4956]: I0314 09:47:58.749589 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstackclient_4fbb48a6-d5ed-423b-90e4-809c839c8675/openstackclient/0.log" Mar 14 09:47:58 crc kubenswrapper[4956]: I0314 09:47:58.801536 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_51fad570-b371-4486-9a35-bea145391f8f/init-config-reloader/0.log" Mar 14 09:47:58 crc kubenswrapper[4956]: I0314 09:47:58.988930 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_51fad570-b371-4486-9a35-bea145391f8f/init-config-reloader/0.log" Mar 14 09:47:58 crc kubenswrapper[4956]: I0314 09:47:58.994090 4956 scope.go:117] "RemoveContainer" containerID="262120b322fff85bf80f0b7b1393c99f22e6b06a0982bcc04f4ba407e0b1e385" Mar 14 09:47:59 crc kubenswrapper[4956]: I0314 09:47:59.019177 4956 scope.go:117] "RemoveContainer" containerID="e4536404463e2f6abe58c4069d46bcfe563f96914cb64b0fe01d5a4b2e4cba8e" Mar 14 09:47:59 crc kubenswrapper[4956]: I0314 09:47:59.030006 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_51fad570-b371-4486-9a35-bea145391f8f/config-reloader/0.log" Mar 14 09:47:59 crc kubenswrapper[4956]: 
I0314 09:47:59.057522 4956 scope.go:117] "RemoveContainer" containerID="fcd7cf09991007254a8a2dd7e1cf7fee84fd89807c5d9afdb30692408555c7a3" Mar 14 09:47:59 crc kubenswrapper[4956]: I0314 09:47:59.104580 4956 scope.go:117] "RemoveContainer" containerID="a0ab92fa2aa3182014847d37e3fd318b21a5bb2f1c478fafac83ac6c21510369" Mar 14 09:47:59 crc kubenswrapper[4956]: I0314 09:47:59.112885 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_51fad570-b371-4486-9a35-bea145391f8f/prometheus/0.log" Mar 14 09:47:59 crc kubenswrapper[4956]: I0314 09:47:59.140021 4956 scope.go:117] "RemoveContainer" containerID="03c0b99550487f4685aa386603445d10f61ee3410b31f7b8217e2105527bfe83" Mar 14 09:47:59 crc kubenswrapper[4956]: I0314 09:47:59.182758 4956 scope.go:117] "RemoveContainer" containerID="5c670ffe272006dd521ff52eb0d057dc2460b566a0c6187610c6e00b4e6010bb" Mar 14 09:47:59 crc kubenswrapper[4956]: I0314 09:47:59.207560 4956 scope.go:117] "RemoveContainer" containerID="a5d699beb83b3a31c0f680edc711efcb80d5075d8032ae393c1cfaf597c0ed52" Mar 14 09:47:59 crc kubenswrapper[4956]: I0314 09:47:59.251768 4956 scope.go:117] "RemoveContainer" containerID="9af1cd6069e07e614f66c814eff30be6f4937a9b522bee17ce0535aafb1b020b" Mar 14 09:47:59 crc kubenswrapper[4956]: I0314 09:47:59.326204 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_51fad570-b371-4486-9a35-bea145391f8f/thanos-sidecar/0.log" Mar 14 09:47:59 crc kubenswrapper[4956]: I0314 09:47:59.343220 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_ca474341-5bf4-4fa5-bc46-56d42c3ccffd/setup-container/0.log" Mar 14 09:47:59 crc kubenswrapper[4956]: I0314 09:47:59.531715 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_ca474341-5bf4-4fa5-bc46-56d42c3ccffd/setup-container/0.log" Mar 14 09:47:59 
crc kubenswrapper[4956]: I0314 09:47:59.616467 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_ca474341-5bf4-4fa5-bc46-56d42c3ccffd/rabbitmq/0.log" Mar 14 09:47:59 crc kubenswrapper[4956]: I0314 09:47:59.753399 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_28cef006-3a3a-464f-b6d0-9faea75b0a9e/setup-container/0.log" Mar 14 09:47:59 crc kubenswrapper[4956]: I0314 09:47:59.941763 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_28cef006-3a3a-464f-b6d0-9faea75b0a9e/setup-container/0.log" Mar 14 09:47:59 crc kubenswrapper[4956]: I0314 09:47:59.981016 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_28cef006-3a3a-464f-b6d0-9faea75b0a9e/rabbitmq/0.log" Mar 14 09:48:00 crc kubenswrapper[4956]: I0314 09:48:00.157449 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558028-dcwt6"] Mar 14 09:48:00 crc kubenswrapper[4956]: E0314 09:48:00.157870 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="864a9dee-9abe-458e-9fc7-151e1df7c41c" containerName="oc" Mar 14 09:48:00 crc kubenswrapper[4956]: I0314 09:48:00.157886 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="864a9dee-9abe-458e-9fc7-151e1df7c41c" containerName="oc" Mar 14 09:48:00 crc kubenswrapper[4956]: I0314 09:48:00.158086 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="864a9dee-9abe-458e-9fc7-151e1df7c41c" containerName="oc" Mar 14 09:48:00 crc kubenswrapper[4956]: I0314 09:48:00.158724 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558028-dcwt6" Mar 14 09:48:00 crc kubenswrapper[4956]: I0314 09:48:00.163846 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:48:00 crc kubenswrapper[4956]: I0314 09:48:00.164116 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:48:00 crc kubenswrapper[4956]: I0314 09:48:00.164230 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:48:00 crc kubenswrapper[4956]: I0314 09:48:00.166425 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558028-dcwt6"] Mar 14 09:48:00 crc kubenswrapper[4956]: I0314 09:48:00.270346 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-546jh\" (UniqueName: \"kubernetes.io/projected/9435121f-6fad-4634-9595-b998a87504a2-kube-api-access-546jh\") pod \"auto-csr-approver-29558028-dcwt6\" (UID: \"9435121f-6fad-4634-9595-b998a87504a2\") " pod="openshift-infra/auto-csr-approver-29558028-dcwt6" Mar 14 09:48:00 crc kubenswrapper[4956]: I0314 09:48:00.371421 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-546jh\" (UniqueName: \"kubernetes.io/projected/9435121f-6fad-4634-9595-b998a87504a2-kube-api-access-546jh\") pod \"auto-csr-approver-29558028-dcwt6\" (UID: \"9435121f-6fad-4634-9595-b998a87504a2\") " pod="openshift-infra/auto-csr-approver-29558028-dcwt6" Mar 14 09:48:00 crc kubenswrapper[4956]: I0314 09:48:00.396542 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-546jh\" (UniqueName: \"kubernetes.io/projected/9435121f-6fad-4634-9595-b998a87504a2-kube-api-access-546jh\") pod \"auto-csr-approver-29558028-dcwt6\" (UID: \"9435121f-6fad-4634-9595-b998a87504a2\") " 
pod="openshift-infra/auto-csr-approver-29558028-dcwt6" Mar 14 09:48:00 crc kubenswrapper[4956]: I0314 09:48:00.477633 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558028-dcwt6" Mar 14 09:48:00 crc kubenswrapper[4956]: I0314 09:48:00.993857 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558028-dcwt6"] Mar 14 09:48:01 crc kubenswrapper[4956]: I0314 09:48:01.271885 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558028-dcwt6" event={"ID":"9435121f-6fad-4634-9595-b998a87504a2","Type":"ContainerStarted","Data":"874be5342b4a1e3e6e2f89697dd4ad287df98042d07adfd2b68f9cb2236728c9"} Mar 14 09:48:02 crc kubenswrapper[4956]: I0314 09:48:02.279765 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558028-dcwt6" event={"ID":"9435121f-6fad-4634-9595-b998a87504a2","Type":"ContainerStarted","Data":"cdaf6db3efe07a82a84b7ac8e890b8b9e273f097fd2595526927ab4afaff0e1e"} Mar 14 09:48:02 crc kubenswrapper[4956]: I0314 09:48:02.298393 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558028-dcwt6" podStartSLOduration=1.47361375 podStartE2EDuration="2.298373471s" podCreationTimestamp="2026-03-14 09:48:00 +0000 UTC" firstStartedPulling="2026-03-14 09:48:01.002697087 +0000 UTC m=+3086.515389365" lastFinishedPulling="2026-03-14 09:48:01.827456818 +0000 UTC m=+3087.340149086" observedRunningTime="2026-03-14 09:48:02.291346014 +0000 UTC m=+3087.804038282" watchObservedRunningTime="2026-03-14 09:48:02.298373471 +0000 UTC m=+3087.811065739" Mar 14 09:48:03 crc kubenswrapper[4956]: I0314 09:48:03.290695 4956 generic.go:334] "Generic (PLEG): container finished" podID="9435121f-6fad-4634-9595-b998a87504a2" containerID="cdaf6db3efe07a82a84b7ac8e890b8b9e273f097fd2595526927ab4afaff0e1e" exitCode=0 Mar 14 09:48:03 crc 
kubenswrapper[4956]: I0314 09:48:03.290765 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558028-dcwt6" event={"ID":"9435121f-6fad-4634-9595-b998a87504a2","Type":"ContainerDied","Data":"cdaf6db3efe07a82a84b7ac8e890b8b9e273f097fd2595526927ab4afaff0e1e"} Mar 14 09:48:04 crc kubenswrapper[4956]: I0314 09:48:04.210409 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:48:04 crc kubenswrapper[4956]: E0314 09:48:04.210611 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:48:04 crc kubenswrapper[4956]: I0314 09:48:04.641745 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558028-dcwt6" Mar 14 09:48:04 crc kubenswrapper[4956]: I0314 09:48:04.646944 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-546jh\" (UniqueName: \"kubernetes.io/projected/9435121f-6fad-4634-9595-b998a87504a2-kube-api-access-546jh\") pod \"9435121f-6fad-4634-9595-b998a87504a2\" (UID: \"9435121f-6fad-4634-9595-b998a87504a2\") " Mar 14 09:48:04 crc kubenswrapper[4956]: I0314 09:48:04.654068 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9435121f-6fad-4634-9595-b998a87504a2-kube-api-access-546jh" (OuterVolumeSpecName: "kube-api-access-546jh") pod "9435121f-6fad-4634-9595-b998a87504a2" (UID: "9435121f-6fad-4634-9595-b998a87504a2"). InnerVolumeSpecName "kube-api-access-546jh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:48:04 crc kubenswrapper[4956]: I0314 09:48:04.748835 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-546jh\" (UniqueName: \"kubernetes.io/projected/9435121f-6fad-4634-9595-b998a87504a2-kube-api-access-546jh\") on node \"crc\" DevicePath \"\"" Mar 14 09:48:05 crc kubenswrapper[4956]: I0314 09:48:05.305399 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558028-dcwt6" event={"ID":"9435121f-6fad-4634-9595-b998a87504a2","Type":"ContainerDied","Data":"874be5342b4a1e3e6e2f89697dd4ad287df98042d07adfd2b68f9cb2236728c9"} Mar 14 09:48:05 crc kubenswrapper[4956]: I0314 09:48:05.305447 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="874be5342b4a1e3e6e2f89697dd4ad287df98042d07adfd2b68f9cb2236728c9" Mar 14 09:48:05 crc kubenswrapper[4956]: I0314 09:48:05.305558 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558028-dcwt6" Mar 14 09:48:05 crc kubenswrapper[4956]: I0314 09:48:05.365180 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558022-ngvx2"] Mar 14 09:48:05 crc kubenswrapper[4956]: I0314 09:48:05.374036 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558022-ngvx2"] Mar 14 09:48:07 crc kubenswrapper[4956]: I0314 09:48:07.224459 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8def780b-f037-47da-ab01-83139e1a44b3" path="/var/lib/kubelet/pods/8def780b-f037-47da-ab01-83139e1a44b3/volumes" Mar 14 09:48:09 crc kubenswrapper[4956]: I0314 09:48:09.136469 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_memcached-0_2973e91c-c0d8-4a9c-871e-0147ecc86297/memcached/0.log" Mar 14 09:48:16 crc kubenswrapper[4956]: I0314 09:48:16.705280 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r_75426cf5-0280-405c-a216-58ba481acb46/util/0.log" Mar 14 09:48:16 crc kubenswrapper[4956]: I0314 09:48:16.951426 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r_75426cf5-0280-405c-a216-58ba481acb46/pull/0.log" Mar 14 09:48:16 crc kubenswrapper[4956]: I0314 09:48:16.956667 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r_75426cf5-0280-405c-a216-58ba481acb46/pull/0.log" Mar 14 09:48:16 crc kubenswrapper[4956]: I0314 09:48:16.973516 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r_75426cf5-0280-405c-a216-58ba481acb46/util/0.log" Mar 14 09:48:17 crc kubenswrapper[4956]: I0314 09:48:17.151861 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r_75426cf5-0280-405c-a216-58ba481acb46/util/0.log" Mar 14 09:48:17 crc kubenswrapper[4956]: I0314 09:48:17.176512 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r_75426cf5-0280-405c-a216-58ba481acb46/pull/0.log" Mar 14 09:48:17 crc kubenswrapper[4956]: I0314 09:48:17.178122 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8744sb7r_75426cf5-0280-405c-a216-58ba481acb46/extract/0.log" Mar 14 09:48:17 crc kubenswrapper[4956]: I0314 09:48:17.209287 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:48:17 crc kubenswrapper[4956]: E0314 09:48:17.209573 4956 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:48:17 crc kubenswrapper[4956]: I0314 09:48:17.339644 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6_37548b9f-0521-4b58-a42a-a023fe66022b/util/0.log" Mar 14 09:48:17 crc kubenswrapper[4956]: I0314 09:48:17.493394 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6_37548b9f-0521-4b58-a42a-a023fe66022b/util/0.log" Mar 14 09:48:17 crc kubenswrapper[4956]: I0314 09:48:17.508677 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6_37548b9f-0521-4b58-a42a-a023fe66022b/pull/0.log" Mar 14 09:48:17 crc kubenswrapper[4956]: I0314 09:48:17.536060 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6_37548b9f-0521-4b58-a42a-a023fe66022b/pull/0.log" Mar 14 09:48:17 crc kubenswrapper[4956]: I0314 09:48:17.725831 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6_37548b9f-0521-4b58-a42a-a023fe66022b/extract/0.log" Mar 14 09:48:17 crc kubenswrapper[4956]: I0314 09:48:17.732881 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6_37548b9f-0521-4b58-a42a-a023fe66022b/util/0.log" Mar 14 09:48:17 crc 
kubenswrapper[4956]: I0314 09:48:17.745883 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1tlsz6_37548b9f-0521-4b58-a42a-a023fe66022b/pull/0.log" Mar 14 09:48:17 crc kubenswrapper[4956]: I0314 09:48:17.931073 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9_a710d73a-4449-44aa-b6e0-198adf069d08/util/0.log" Mar 14 09:48:18 crc kubenswrapper[4956]: I0314 09:48:18.112967 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9_a710d73a-4449-44aa-b6e0-198adf069d08/pull/0.log" Mar 14 09:48:18 crc kubenswrapper[4956]: I0314 09:48:18.121203 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9_a710d73a-4449-44aa-b6e0-198adf069d08/util/0.log" Mar 14 09:48:18 crc kubenswrapper[4956]: I0314 09:48:18.122962 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9_a710d73a-4449-44aa-b6e0-198adf069d08/pull/0.log" Mar 14 09:48:18 crc kubenswrapper[4956]: I0314 09:48:18.244987 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9_a710d73a-4449-44aa-b6e0-198adf069d08/util/0.log" Mar 14 09:48:18 crc kubenswrapper[4956]: I0314 09:48:18.308393 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9_a710d73a-4449-44aa-b6e0-198adf069d08/extract/0.log" Mar 14 09:48:18 crc kubenswrapper[4956]: I0314 09:48:18.319446 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57vqd9_a710d73a-4449-44aa-b6e0-198adf069d08/pull/0.log" Mar 14 09:48:18 crc kubenswrapper[4956]: I0314 09:48:18.417068 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl_804193a1-2db0-4ac9-a126-4c79735f8302/util/0.log" Mar 14 09:48:18 crc kubenswrapper[4956]: I0314 09:48:18.588698 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl_804193a1-2db0-4ac9-a126-4c79735f8302/util/0.log" Mar 14 09:48:18 crc kubenswrapper[4956]: I0314 09:48:18.591118 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl_804193a1-2db0-4ac9-a126-4c79735f8302/pull/0.log" Mar 14 09:48:18 crc kubenswrapper[4956]: I0314 09:48:18.593534 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl_804193a1-2db0-4ac9-a126-4c79735f8302/pull/0.log" Mar 14 09:48:18 crc kubenswrapper[4956]: I0314 09:48:18.773341 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl_804193a1-2db0-4ac9-a126-4c79735f8302/util/0.log" Mar 14 09:48:18 crc kubenswrapper[4956]: I0314 09:48:18.797102 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl_804193a1-2db0-4ac9-a126-4c79735f8302/extract/0.log" Mar 14 09:48:18 crc kubenswrapper[4956]: I0314 09:48:18.798224 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08spsnl_804193a1-2db0-4ac9-a126-4c79735f8302/pull/0.log" Mar 14 
09:48:18 crc kubenswrapper[4956]: I0314 09:48:18.942582 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzmj4_7e5025eb-1d48-4be9-ab21-5aa5c0436d7b/extract-utilities/0.log" Mar 14 09:48:19 crc kubenswrapper[4956]: I0314 09:48:19.110687 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzmj4_7e5025eb-1d48-4be9-ab21-5aa5c0436d7b/extract-content/0.log" Mar 14 09:48:19 crc kubenswrapper[4956]: I0314 09:48:19.117330 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzmj4_7e5025eb-1d48-4be9-ab21-5aa5c0436d7b/extract-utilities/0.log" Mar 14 09:48:19 crc kubenswrapper[4956]: I0314 09:48:19.118817 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzmj4_7e5025eb-1d48-4be9-ab21-5aa5c0436d7b/extract-content/0.log" Mar 14 09:48:19 crc kubenswrapper[4956]: I0314 09:48:19.291886 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzmj4_7e5025eb-1d48-4be9-ab21-5aa5c0436d7b/extract-utilities/0.log" Mar 14 09:48:19 crc kubenswrapper[4956]: I0314 09:48:19.299220 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzmj4_7e5025eb-1d48-4be9-ab21-5aa5c0436d7b/extract-content/0.log" Mar 14 09:48:19 crc kubenswrapper[4956]: I0314 09:48:19.576962 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fnwsk_0eb1f1fb-17bc-4cc7-a158-6576b474e996/extract-utilities/0.log" Mar 14 09:48:19 crc kubenswrapper[4956]: I0314 09:48:19.743718 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fnwsk_0eb1f1fb-17bc-4cc7-a158-6576b474e996/extract-content/0.log" Mar 14 09:48:19 crc kubenswrapper[4956]: I0314 09:48:19.751568 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-fnwsk_0eb1f1fb-17bc-4cc7-a158-6576b474e996/extract-utilities/0.log" Mar 14 09:48:19 crc kubenswrapper[4956]: I0314 09:48:19.801148 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzmj4_7e5025eb-1d48-4be9-ab21-5aa5c0436d7b/registry-server/0.log" Mar 14 09:48:19 crc kubenswrapper[4956]: I0314 09:48:19.858864 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fnwsk_0eb1f1fb-17bc-4cc7-a158-6576b474e996/extract-content/0.log" Mar 14 09:48:20 crc kubenswrapper[4956]: I0314 09:48:20.135787 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fnwsk_0eb1f1fb-17bc-4cc7-a158-6576b474e996/extract-content/0.log" Mar 14 09:48:20 crc kubenswrapper[4956]: I0314 09:48:20.314810 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fnwsk_0eb1f1fb-17bc-4cc7-a158-6576b474e996/extract-utilities/0.log" Mar 14 09:48:20 crc kubenswrapper[4956]: I0314 09:48:20.463495 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7gnpp_92ce0359-87bc-46d6-8673-b10febbf0742/marketplace-operator/0.log" Mar 14 09:48:20 crc kubenswrapper[4956]: I0314 09:48:20.534064 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6mtml_098e20c5-7d99-4468-896d-fb13222a450b/extract-utilities/0.log" Mar 14 09:48:20 crc kubenswrapper[4956]: I0314 09:48:20.812017 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fnwsk_0eb1f1fb-17bc-4cc7-a158-6576b474e996/registry-server/0.log" Mar 14 09:48:20 crc kubenswrapper[4956]: I0314 09:48:20.837654 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-6mtml_098e20c5-7d99-4468-896d-fb13222a450b/extract-content/0.log" Mar 14 09:48:20 crc kubenswrapper[4956]: I0314 09:48:20.873430 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6mtml_098e20c5-7d99-4468-896d-fb13222a450b/extract-utilities/0.log" Mar 14 09:48:20 crc kubenswrapper[4956]: I0314 09:48:20.906793 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6mtml_098e20c5-7d99-4468-896d-fb13222a450b/extract-content/0.log" Mar 14 09:48:21 crc kubenswrapper[4956]: I0314 09:48:21.045661 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6mtml_098e20c5-7d99-4468-896d-fb13222a450b/extract-content/0.log" Mar 14 09:48:21 crc kubenswrapper[4956]: I0314 09:48:21.050962 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6mtml_098e20c5-7d99-4468-896d-fb13222a450b/extract-utilities/0.log" Mar 14 09:48:21 crc kubenswrapper[4956]: I0314 09:48:21.153380 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6mtml_098e20c5-7d99-4468-896d-fb13222a450b/registry-server/0.log" Mar 14 09:48:21 crc kubenswrapper[4956]: I0314 09:48:21.231095 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5d92s_e7f24159-2581-4653-be69-3aae1dd7e3f9/extract-utilities/0.log" Mar 14 09:48:21 crc kubenswrapper[4956]: I0314 09:48:21.440351 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5d92s_e7f24159-2581-4653-be69-3aae1dd7e3f9/extract-content/0.log" Mar 14 09:48:21 crc kubenswrapper[4956]: I0314 09:48:21.447222 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-5d92s_e7f24159-2581-4653-be69-3aae1dd7e3f9/extract-utilities/0.log" Mar 14 09:48:21 crc kubenswrapper[4956]: I0314 09:48:21.461205 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5d92s_e7f24159-2581-4653-be69-3aae1dd7e3f9/extract-content/0.log" Mar 14 09:48:21 crc kubenswrapper[4956]: I0314 09:48:21.610736 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5d92s_e7f24159-2581-4653-be69-3aae1dd7e3f9/extract-content/0.log" Mar 14 09:48:21 crc kubenswrapper[4956]: I0314 09:48:21.615611 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5d92s_e7f24159-2581-4653-be69-3aae1dd7e3f9/extract-utilities/0.log" Mar 14 09:48:21 crc kubenswrapper[4956]: I0314 09:48:21.990618 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5d92s_e7f24159-2581-4653-be69-3aae1dd7e3f9/registry-server/0.log" Mar 14 09:48:32 crc kubenswrapper[4956]: I0314 09:48:32.209779 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:48:32 crc kubenswrapper[4956]: E0314 09:48:32.210639 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:48:33 crc kubenswrapper[4956]: I0314 09:48:33.702781 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6b9c87565f-llzfw_c888df81-af56-4ddd-8857-4952f199f288/prometheus-operator-admission-webhook/0.log" Mar 14 09:48:33 crc kubenswrapper[4956]: I0314 09:48:33.721688 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-259ww_c9a78fe1-8b1a-4655-ad98-19c53622c2b1/prometheus-operator/0.log" Mar 14 09:48:33 crc kubenswrapper[4956]: I0314 09:48:33.722592 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6b9c87565f-wt77z_b9e5b9f8-a1eb-4cd9-8200-dbf2cbdb4193/prometheus-operator-admission-webhook/0.log" Mar 14 09:48:34 crc kubenswrapper[4956]: I0314 09:48:34.089398 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-hkwjk_0be4e37c-1708-4556-9ac0-e6daaf8fdadf/observability-ui-dashboards/0.log" Mar 14 09:48:34 crc kubenswrapper[4956]: I0314 09:48:34.104254 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-q6fzl_2dd43b3c-f9be-41a8-b1a6-11ab052283c7/operator/0.log" Mar 14 09:48:34 crc kubenswrapper[4956]: I0314 09:48:34.150843 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-2vqmj_50a4be30-e002-4c15-b3ef-b3048665261b/perses-operator/0.log" Mar 14 09:48:45 crc kubenswrapper[4956]: I0314 09:48:45.216057 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:48:45 crc kubenswrapper[4956]: E0314 09:48:45.216725 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:48:56 crc kubenswrapper[4956]: I0314 09:48:56.209283 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:48:56 crc kubenswrapper[4956]: E0314 09:48:56.209991 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:48:59 crc kubenswrapper[4956]: I0314 09:48:59.418853 4956 scope.go:117] "RemoveContainer" containerID="af87f18ef30904aa0e3c3acccabb089142350d73082840a4d380c92b738945e8" Mar 14 09:48:59 crc kubenswrapper[4956]: I0314 09:48:59.480358 4956 scope.go:117] "RemoveContainer" containerID="222a4a8571366504790b1be5a3e7932b0b43e132551191bb7958df6f1a230119" Mar 14 09:49:11 crc kubenswrapper[4956]: I0314 09:49:11.209725 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:49:11 crc kubenswrapper[4956]: E0314 09:49:11.210455 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:49:26 crc kubenswrapper[4956]: I0314 09:49:26.210601 4956 
scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:49:26 crc kubenswrapper[4956]: E0314 09:49:26.211287 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:49:37 crc kubenswrapper[4956]: I0314 09:49:37.209871 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:49:37 crc kubenswrapper[4956]: E0314 09:49:37.210730 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:49:48 crc kubenswrapper[4956]: I0314 09:49:48.209911 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:49:48 crc kubenswrapper[4956]: E0314 09:49:48.211186 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:49:51 crc kubenswrapper[4956]: I0314 
09:49:51.107954 4956 generic.go:334] "Generic (PLEG): container finished" podID="7e68c8bc-ec3d-447e-ba56-dda97420dd80" containerID="03fc8db1f82412c77d8985eb52e625b4148ef5617d5690f935be8b4ef51e51a9" exitCode=0 Mar 14 09:49:51 crc kubenswrapper[4956]: I0314 09:49:51.108045 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nc79n/must-gather-lnld2" event={"ID":"7e68c8bc-ec3d-447e-ba56-dda97420dd80","Type":"ContainerDied","Data":"03fc8db1f82412c77d8985eb52e625b4148ef5617d5690f935be8b4ef51e51a9"} Mar 14 09:49:51 crc kubenswrapper[4956]: I0314 09:49:51.108994 4956 scope.go:117] "RemoveContainer" containerID="03fc8db1f82412c77d8985eb52e625b4148ef5617d5690f935be8b4ef51e51a9" Mar 14 09:49:51 crc kubenswrapper[4956]: I0314 09:49:51.425941 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nc79n_must-gather-lnld2_7e68c8bc-ec3d-447e-ba56-dda97420dd80/gather/0.log" Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.010408 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nc79n/must-gather-lnld2"] Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.011784 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-nc79n/must-gather-lnld2" podUID="7e68c8bc-ec3d-447e-ba56-dda97420dd80" containerName="copy" containerID="cri-o://28655a9a0e2fbe10e7aa8c194f7796c43f24b4d359a5cd672a7c1a8f2db09f70" gracePeriod=2 Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.019503 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nc79n/must-gather-lnld2"] Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.175239 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nc79n_must-gather-lnld2_7e68c8bc-ec3d-447e-ba56-dda97420dd80/copy/0.log" Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.175657 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="7e68c8bc-ec3d-447e-ba56-dda97420dd80" containerID="28655a9a0e2fbe10e7aa8c194f7796c43f24b4d359a5cd672a7c1a8f2db09f70" exitCode=143 Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.504777 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nc79n_must-gather-lnld2_7e68c8bc-ec3d-447e-ba56-dda97420dd80/copy/0.log" Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.505676 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nc79n/must-gather-lnld2" Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.591706 4956 scope.go:117] "RemoveContainer" containerID="f5ab6b2e2e353ab2544b81611534a1925b470fab2198f521f67b7d6423d4136f" Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.598334 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e68c8bc-ec3d-447e-ba56-dda97420dd80-must-gather-output\") pod \"7e68c8bc-ec3d-447e-ba56-dda97420dd80\" (UID: \"7e68c8bc-ec3d-447e-ba56-dda97420dd80\") " Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.598576 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2npsq\" (UniqueName: \"kubernetes.io/projected/7e68c8bc-ec3d-447e-ba56-dda97420dd80-kube-api-access-2npsq\") pod \"7e68c8bc-ec3d-447e-ba56-dda97420dd80\" (UID: \"7e68c8bc-ec3d-447e-ba56-dda97420dd80\") " Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.604736 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e68c8bc-ec3d-447e-ba56-dda97420dd80-kube-api-access-2npsq" (OuterVolumeSpecName: "kube-api-access-2npsq") pod "7e68c8bc-ec3d-447e-ba56-dda97420dd80" (UID: "7e68c8bc-ec3d-447e-ba56-dda97420dd80"). InnerVolumeSpecName "kube-api-access-2npsq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.610028 4956 scope.go:117] "RemoveContainer" containerID="e70d565c754401c3d75812dd5dd7621be42ac1eed5a8cf608b9bb035994147b9" Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.666459 4956 scope.go:117] "RemoveContainer" containerID="f59c119113a1d8209718cbfedd24007c7228978ce5151769e01394556d2d139c" Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.687967 4956 scope.go:117] "RemoveContainer" containerID="48a5745b1aa44ba60cd3b7447229aa0b367cd12eac97674caff8f45d9c14a52a" Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.700279 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2npsq\" (UniqueName: \"kubernetes.io/projected/7e68c8bc-ec3d-447e-ba56-dda97420dd80-kube-api-access-2npsq\") on node \"crc\" DevicePath \"\"" Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.713256 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e68c8bc-ec3d-447e-ba56-dda97420dd80-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7e68c8bc-ec3d-447e-ba56-dda97420dd80" (UID: "7e68c8bc-ec3d-447e-ba56-dda97420dd80"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.726167 4956 scope.go:117] "RemoveContainer" containerID="ac5afc3dbeb9d9b56af193314acd72c0aa0769368fc184897822a199824d8369" Mar 14 09:49:59 crc kubenswrapper[4956]: I0314 09:49:59.801874 4956 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e68c8bc-ec3d-447e-ba56-dda97420dd80-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.167385 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558030-d7hqz"] Mar 14 09:50:00 crc kubenswrapper[4956]: E0314 09:50:00.167871 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e68c8bc-ec3d-447e-ba56-dda97420dd80" containerName="gather" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.167899 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e68c8bc-ec3d-447e-ba56-dda97420dd80" containerName="gather" Mar 14 09:50:00 crc kubenswrapper[4956]: E0314 09:50:00.167918 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9435121f-6fad-4634-9595-b998a87504a2" containerName="oc" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.167926 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="9435121f-6fad-4634-9595-b998a87504a2" containerName="oc" Mar 14 09:50:00 crc kubenswrapper[4956]: E0314 09:50:00.167936 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e68c8bc-ec3d-447e-ba56-dda97420dd80" containerName="copy" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.167944 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e68c8bc-ec3d-447e-ba56-dda97420dd80" containerName="copy" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.168128 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e68c8bc-ec3d-447e-ba56-dda97420dd80" containerName="copy" 
Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.168156 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="9435121f-6fad-4634-9595-b998a87504a2" containerName="oc" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.168168 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e68c8bc-ec3d-447e-ba56-dda97420dd80" containerName="gather" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.168957 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558030-d7hqz" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.171392 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.171457 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.172547 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.178302 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558030-d7hqz"] Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.186675 4956 scope.go:117] "RemoveContainer" containerID="28655a9a0e2fbe10e7aa8c194f7796c43f24b4d359a5cd672a7c1a8f2db09f70" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.186834 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nc79n/must-gather-lnld2" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.207603 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj9nx\" (UniqueName: \"kubernetes.io/projected/fa4ab338-ea51-4535-8ab4-4e7f8bf025e4-kube-api-access-pj9nx\") pod \"auto-csr-approver-29558030-d7hqz\" (UID: \"fa4ab338-ea51-4535-8ab4-4e7f8bf025e4\") " pod="openshift-infra/auto-csr-approver-29558030-d7hqz" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.210629 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:50:00 crc kubenswrapper[4956]: E0314 09:50:00.210812 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.217954 4956 scope.go:117] "RemoveContainer" containerID="03fc8db1f82412c77d8985eb52e625b4148ef5617d5690f935be8b4ef51e51a9" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.309618 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj9nx\" (UniqueName: \"kubernetes.io/projected/fa4ab338-ea51-4535-8ab4-4e7f8bf025e4-kube-api-access-pj9nx\") pod \"auto-csr-approver-29558030-d7hqz\" (UID: \"fa4ab338-ea51-4535-8ab4-4e7f8bf025e4\") " pod="openshift-infra/auto-csr-approver-29558030-d7hqz" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.326465 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj9nx\" (UniqueName: 
\"kubernetes.io/projected/fa4ab338-ea51-4535-8ab4-4e7f8bf025e4-kube-api-access-pj9nx\") pod \"auto-csr-approver-29558030-d7hqz\" (UID: \"fa4ab338-ea51-4535-8ab4-4e7f8bf025e4\") " pod="openshift-infra/auto-csr-approver-29558030-d7hqz" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.542369 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558030-d7hqz" Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.937530 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558030-d7hqz"] Mar 14 09:50:00 crc kubenswrapper[4956]: I0314 09:50:00.944062 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:50:01 crc kubenswrapper[4956]: I0314 09:50:01.197125 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558030-d7hqz" event={"ID":"fa4ab338-ea51-4535-8ab4-4e7f8bf025e4","Type":"ContainerStarted","Data":"85b58419b17fcbe6a36b2803ab604323da4b572add8c44d46b33895c8fc96522"} Mar 14 09:50:01 crc kubenswrapper[4956]: I0314 09:50:01.219313 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e68c8bc-ec3d-447e-ba56-dda97420dd80" path="/var/lib/kubelet/pods/7e68c8bc-ec3d-447e-ba56-dda97420dd80/volumes" Mar 14 09:50:04 crc kubenswrapper[4956]: I0314 09:50:04.221148 4956 generic.go:334] "Generic (PLEG): container finished" podID="fa4ab338-ea51-4535-8ab4-4e7f8bf025e4" containerID="5eed9a60b46a45450a78f5287068604be499fb52999fb7bf6b2bd7bd57106967" exitCode=0 Mar 14 09:50:04 crc kubenswrapper[4956]: I0314 09:50:04.221563 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558030-d7hqz" event={"ID":"fa4ab338-ea51-4535-8ab4-4e7f8bf025e4","Type":"ContainerDied","Data":"5eed9a60b46a45450a78f5287068604be499fb52999fb7bf6b2bd7bd57106967"} Mar 14 09:50:05 crc kubenswrapper[4956]: I0314 09:50:05.519656 4956 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558030-d7hqz" Mar 14 09:50:05 crc kubenswrapper[4956]: I0314 09:50:05.586667 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj9nx\" (UniqueName: \"kubernetes.io/projected/fa4ab338-ea51-4535-8ab4-4e7f8bf025e4-kube-api-access-pj9nx\") pod \"fa4ab338-ea51-4535-8ab4-4e7f8bf025e4\" (UID: \"fa4ab338-ea51-4535-8ab4-4e7f8bf025e4\") " Mar 14 09:50:05 crc kubenswrapper[4956]: I0314 09:50:05.592103 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa4ab338-ea51-4535-8ab4-4e7f8bf025e4-kube-api-access-pj9nx" (OuterVolumeSpecName: "kube-api-access-pj9nx") pod "fa4ab338-ea51-4535-8ab4-4e7f8bf025e4" (UID: "fa4ab338-ea51-4535-8ab4-4e7f8bf025e4"). InnerVolumeSpecName "kube-api-access-pj9nx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:50:05 crc kubenswrapper[4956]: I0314 09:50:05.688715 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj9nx\" (UniqueName: \"kubernetes.io/projected/fa4ab338-ea51-4535-8ab4-4e7f8bf025e4-kube-api-access-pj9nx\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:06 crc kubenswrapper[4956]: I0314 09:50:06.239210 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558030-d7hqz" event={"ID":"fa4ab338-ea51-4535-8ab4-4e7f8bf025e4","Type":"ContainerDied","Data":"85b58419b17fcbe6a36b2803ab604323da4b572add8c44d46b33895c8fc96522"} Mar 14 09:50:06 crc kubenswrapper[4956]: I0314 09:50:06.239252 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85b58419b17fcbe6a36b2803ab604323da4b572add8c44d46b33895c8fc96522" Mar 14 09:50:06 crc kubenswrapper[4956]: I0314 09:50:06.239311 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558030-d7hqz" Mar 14 09:50:06 crc kubenswrapper[4956]: I0314 09:50:06.573248 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558024-slxr8"] Mar 14 09:50:06 crc kubenswrapper[4956]: I0314 09:50:06.580293 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558024-slxr8"] Mar 14 09:50:07 crc kubenswrapper[4956]: I0314 09:50:07.217617 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="876cec90-17ca-4a4c-8cbc-462048c32b1c" path="/var/lib/kubelet/pods/876cec90-17ca-4a4c-8cbc-462048c32b1c/volumes" Mar 14 09:50:15 crc kubenswrapper[4956]: I0314 09:50:15.215527 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:50:15 crc kubenswrapper[4956]: E0314 09:50:15.216222 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:50:18 crc kubenswrapper[4956]: I0314 09:50:18.712197 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v95z4"] Mar 14 09:50:18 crc kubenswrapper[4956]: E0314 09:50:18.712870 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa4ab338-ea51-4535-8ab4-4e7f8bf025e4" containerName="oc" Mar 14 09:50:18 crc kubenswrapper[4956]: I0314 09:50:18.712889 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa4ab338-ea51-4535-8ab4-4e7f8bf025e4" containerName="oc" Mar 14 09:50:18 crc kubenswrapper[4956]: I0314 09:50:18.713093 4956 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="fa4ab338-ea51-4535-8ab4-4e7f8bf025e4" containerName="oc" Mar 14 09:50:18 crc kubenswrapper[4956]: I0314 09:50:18.714238 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v95z4" Mar 14 09:50:18 crc kubenswrapper[4956]: I0314 09:50:18.736921 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v95z4"] Mar 14 09:50:18 crc kubenswrapper[4956]: I0314 09:50:18.778951 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6520c5a-bc4e-48e8-a9a1-3738103894a1-utilities\") pod \"redhat-operators-v95z4\" (UID: \"f6520c5a-bc4e-48e8-a9a1-3738103894a1\") " pod="openshift-marketplace/redhat-operators-v95z4" Mar 14 09:50:18 crc kubenswrapper[4956]: I0314 09:50:18.779037 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6520c5a-bc4e-48e8-a9a1-3738103894a1-catalog-content\") pod \"redhat-operators-v95z4\" (UID: \"f6520c5a-bc4e-48e8-a9a1-3738103894a1\") " pod="openshift-marketplace/redhat-operators-v95z4" Mar 14 09:50:18 crc kubenswrapper[4956]: I0314 09:50:18.779142 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf4n4\" (UniqueName: \"kubernetes.io/projected/f6520c5a-bc4e-48e8-a9a1-3738103894a1-kube-api-access-jf4n4\") pod \"redhat-operators-v95z4\" (UID: \"f6520c5a-bc4e-48e8-a9a1-3738103894a1\") " pod="openshift-marketplace/redhat-operators-v95z4" Mar 14 09:50:18 crc kubenswrapper[4956]: I0314 09:50:18.881459 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6520c5a-bc4e-48e8-a9a1-3738103894a1-utilities\") pod \"redhat-operators-v95z4\" (UID: \"f6520c5a-bc4e-48e8-a9a1-3738103894a1\") " 
pod="openshift-marketplace/redhat-operators-v95z4" Mar 14 09:50:18 crc kubenswrapper[4956]: I0314 09:50:18.881783 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6520c5a-bc4e-48e8-a9a1-3738103894a1-catalog-content\") pod \"redhat-operators-v95z4\" (UID: \"f6520c5a-bc4e-48e8-a9a1-3738103894a1\") " pod="openshift-marketplace/redhat-operators-v95z4" Mar 14 09:50:18 crc kubenswrapper[4956]: I0314 09:50:18.881918 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf4n4\" (UniqueName: \"kubernetes.io/projected/f6520c5a-bc4e-48e8-a9a1-3738103894a1-kube-api-access-jf4n4\") pod \"redhat-operators-v95z4\" (UID: \"f6520c5a-bc4e-48e8-a9a1-3738103894a1\") " pod="openshift-marketplace/redhat-operators-v95z4" Mar 14 09:50:18 crc kubenswrapper[4956]: I0314 09:50:18.882271 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6520c5a-bc4e-48e8-a9a1-3738103894a1-utilities\") pod \"redhat-operators-v95z4\" (UID: \"f6520c5a-bc4e-48e8-a9a1-3738103894a1\") " pod="openshift-marketplace/redhat-operators-v95z4" Mar 14 09:50:18 crc kubenswrapper[4956]: I0314 09:50:18.882330 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6520c5a-bc4e-48e8-a9a1-3738103894a1-catalog-content\") pod \"redhat-operators-v95z4\" (UID: \"f6520c5a-bc4e-48e8-a9a1-3738103894a1\") " pod="openshift-marketplace/redhat-operators-v95z4" Mar 14 09:50:18 crc kubenswrapper[4956]: I0314 09:50:18.921631 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf4n4\" (UniqueName: \"kubernetes.io/projected/f6520c5a-bc4e-48e8-a9a1-3738103894a1-kube-api-access-jf4n4\") pod \"redhat-operators-v95z4\" (UID: \"f6520c5a-bc4e-48e8-a9a1-3738103894a1\") " pod="openshift-marketplace/redhat-operators-v95z4" Mar 
14 09:50:19 crc kubenswrapper[4956]: I0314 09:50:19.033082 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v95z4" Mar 14 09:50:19 crc kubenswrapper[4956]: I0314 09:50:19.440578 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v95z4"] Mar 14 09:50:20 crc kubenswrapper[4956]: I0314 09:50:20.382638 4956 generic.go:334] "Generic (PLEG): container finished" podID="f6520c5a-bc4e-48e8-a9a1-3738103894a1" containerID="d723788e75777258ff6fa31cefa9d99e7d289d7206c4a1503fe7f0445654427d" exitCode=0 Mar 14 09:50:20 crc kubenswrapper[4956]: I0314 09:50:20.382870 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v95z4" event={"ID":"f6520c5a-bc4e-48e8-a9a1-3738103894a1","Type":"ContainerDied","Data":"d723788e75777258ff6fa31cefa9d99e7d289d7206c4a1503fe7f0445654427d"} Mar 14 09:50:20 crc kubenswrapper[4956]: I0314 09:50:20.383074 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v95z4" event={"ID":"f6520c5a-bc4e-48e8-a9a1-3738103894a1","Type":"ContainerStarted","Data":"5a283b1d5f26f68e925bbc8f260cba5d3247c7c006a7926eb5ba528497e00baf"} Mar 14 09:50:22 crc kubenswrapper[4956]: I0314 09:50:22.400501 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v95z4" event={"ID":"f6520c5a-bc4e-48e8-a9a1-3738103894a1","Type":"ContainerStarted","Data":"367a0ab9ed43c8a1786138d8961dac6d13d6bc251d1c995a7b4d9a81ef0058d2"} Mar 14 09:50:23 crc kubenswrapper[4956]: I0314 09:50:23.410280 4956 generic.go:334] "Generic (PLEG): container finished" podID="f6520c5a-bc4e-48e8-a9a1-3738103894a1" containerID="367a0ab9ed43c8a1786138d8961dac6d13d6bc251d1c995a7b4d9a81ef0058d2" exitCode=0 Mar 14 09:50:23 crc kubenswrapper[4956]: I0314 09:50:23.410338 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v95z4" 
event={"ID":"f6520c5a-bc4e-48e8-a9a1-3738103894a1","Type":"ContainerDied","Data":"367a0ab9ed43c8a1786138d8961dac6d13d6bc251d1c995a7b4d9a81ef0058d2"} Mar 14 09:50:24 crc kubenswrapper[4956]: I0314 09:50:24.419463 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v95z4" event={"ID":"f6520c5a-bc4e-48e8-a9a1-3738103894a1","Type":"ContainerStarted","Data":"703a32af6695ae740402b76d7262228cb955d8cf2591cfc8912b5692a9775c0d"} Mar 14 09:50:24 crc kubenswrapper[4956]: I0314 09:50:24.441976 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v95z4" podStartSLOduration=2.774497558 podStartE2EDuration="6.441953037s" podCreationTimestamp="2026-03-14 09:50:18 +0000 UTC" firstStartedPulling="2026-03-14 09:50:20.384306485 +0000 UTC m=+3225.896998753" lastFinishedPulling="2026-03-14 09:50:24.051761964 +0000 UTC m=+3229.564454232" observedRunningTime="2026-03-14 09:50:24.439312931 +0000 UTC m=+3229.952005209" watchObservedRunningTime="2026-03-14 09:50:24.441953037 +0000 UTC m=+3229.954645305" Mar 14 09:50:26 crc kubenswrapper[4956]: I0314 09:50:26.209519 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:50:26 crc kubenswrapper[4956]: E0314 09:50:26.209990 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:50:29 crc kubenswrapper[4956]: I0314 09:50:29.033937 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v95z4" Mar 14 09:50:29 crc kubenswrapper[4956]: 
I0314 09:50:29.034298 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v95z4" Mar 14 09:50:30 crc kubenswrapper[4956]: I0314 09:50:30.074497 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v95z4" podUID="f6520c5a-bc4e-48e8-a9a1-3738103894a1" containerName="registry-server" probeResult="failure" output=< Mar 14 09:50:30 crc kubenswrapper[4956]: timeout: failed to connect service ":50051" within 1s Mar 14 09:50:30 crc kubenswrapper[4956]: > Mar 14 09:50:38 crc kubenswrapper[4956]: I0314 09:50:38.210385 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:50:38 crc kubenswrapper[4956]: E0314 09:50:38.211250 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:50:39 crc kubenswrapper[4956]: I0314 09:50:39.312915 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v95z4" Mar 14 09:50:39 crc kubenswrapper[4956]: I0314 09:50:39.354932 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v95z4" Mar 14 09:50:42 crc kubenswrapper[4956]: I0314 09:50:42.899207 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v95z4"] Mar 14 09:50:42 crc kubenswrapper[4956]: I0314 09:50:42.899789 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v95z4" 
podUID="f6520c5a-bc4e-48e8-a9a1-3738103894a1" containerName="registry-server" containerID="cri-o://703a32af6695ae740402b76d7262228cb955d8cf2591cfc8912b5692a9775c0d" gracePeriod=2 Mar 14 09:50:43 crc kubenswrapper[4956]: I0314 09:50:43.555208 4956 generic.go:334] "Generic (PLEG): container finished" podID="f6520c5a-bc4e-48e8-a9a1-3738103894a1" containerID="703a32af6695ae740402b76d7262228cb955d8cf2591cfc8912b5692a9775c0d" exitCode=0 Mar 14 09:50:43 crc kubenswrapper[4956]: I0314 09:50:43.555283 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v95z4" event={"ID":"f6520c5a-bc4e-48e8-a9a1-3738103894a1","Type":"ContainerDied","Data":"703a32af6695ae740402b76d7262228cb955d8cf2591cfc8912b5692a9775c0d"} Mar 14 09:50:43 crc kubenswrapper[4956]: I0314 09:50:43.831061 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v95z4" Mar 14 09:50:43 crc kubenswrapper[4956]: I0314 09:50:43.933030 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6520c5a-bc4e-48e8-a9a1-3738103894a1-catalog-content\") pod \"f6520c5a-bc4e-48e8-a9a1-3738103894a1\" (UID: \"f6520c5a-bc4e-48e8-a9a1-3738103894a1\") " Mar 14 09:50:43 crc kubenswrapper[4956]: I0314 09:50:43.934221 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6520c5a-bc4e-48e8-a9a1-3738103894a1-utilities\") pod \"f6520c5a-bc4e-48e8-a9a1-3738103894a1\" (UID: \"f6520c5a-bc4e-48e8-a9a1-3738103894a1\") " Mar 14 09:50:43 crc kubenswrapper[4956]: I0314 09:50:43.934326 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf4n4\" (UniqueName: \"kubernetes.io/projected/f6520c5a-bc4e-48e8-a9a1-3738103894a1-kube-api-access-jf4n4\") pod \"f6520c5a-bc4e-48e8-a9a1-3738103894a1\" (UID: 
\"f6520c5a-bc4e-48e8-a9a1-3738103894a1\") " Mar 14 09:50:43 crc kubenswrapper[4956]: I0314 09:50:43.935426 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6520c5a-bc4e-48e8-a9a1-3738103894a1-utilities" (OuterVolumeSpecName: "utilities") pod "f6520c5a-bc4e-48e8-a9a1-3738103894a1" (UID: "f6520c5a-bc4e-48e8-a9a1-3738103894a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:50:43 crc kubenswrapper[4956]: I0314 09:50:43.939402 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6520c5a-bc4e-48e8-a9a1-3738103894a1-kube-api-access-jf4n4" (OuterVolumeSpecName: "kube-api-access-jf4n4") pod "f6520c5a-bc4e-48e8-a9a1-3738103894a1" (UID: "f6520c5a-bc4e-48e8-a9a1-3738103894a1"). InnerVolumeSpecName "kube-api-access-jf4n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:50:44 crc kubenswrapper[4956]: I0314 09:50:44.036932 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6520c5a-bc4e-48e8-a9a1-3738103894a1-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:44 crc kubenswrapper[4956]: I0314 09:50:44.037222 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf4n4\" (UniqueName: \"kubernetes.io/projected/f6520c5a-bc4e-48e8-a9a1-3738103894a1-kube-api-access-jf4n4\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:44 crc kubenswrapper[4956]: I0314 09:50:44.063860 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6520c5a-bc4e-48e8-a9a1-3738103894a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6520c5a-bc4e-48e8-a9a1-3738103894a1" (UID: "f6520c5a-bc4e-48e8-a9a1-3738103894a1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:50:44 crc kubenswrapper[4956]: I0314 09:50:44.138170 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6520c5a-bc4e-48e8-a9a1-3738103894a1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:44 crc kubenswrapper[4956]: I0314 09:50:44.567644 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v95z4" event={"ID":"f6520c5a-bc4e-48e8-a9a1-3738103894a1","Type":"ContainerDied","Data":"5a283b1d5f26f68e925bbc8f260cba5d3247c7c006a7926eb5ba528497e00baf"} Mar 14 09:50:44 crc kubenswrapper[4956]: I0314 09:50:44.567698 4956 scope.go:117] "RemoveContainer" containerID="703a32af6695ae740402b76d7262228cb955d8cf2591cfc8912b5692a9775c0d" Mar 14 09:50:44 crc kubenswrapper[4956]: I0314 09:50:44.567848 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v95z4" Mar 14 09:50:44 crc kubenswrapper[4956]: I0314 09:50:44.599033 4956 scope.go:117] "RemoveContainer" containerID="367a0ab9ed43c8a1786138d8961dac6d13d6bc251d1c995a7b4d9a81ef0058d2" Mar 14 09:50:44 crc kubenswrapper[4956]: I0314 09:50:44.608426 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v95z4"] Mar 14 09:50:44 crc kubenswrapper[4956]: I0314 09:50:44.622809 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v95z4"] Mar 14 09:50:44 crc kubenswrapper[4956]: I0314 09:50:44.642475 4956 scope.go:117] "RemoveContainer" containerID="d723788e75777258ff6fa31cefa9d99e7d289d7206c4a1503fe7f0445654427d" Mar 14 09:50:45 crc kubenswrapper[4956]: I0314 09:50:45.218307 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6520c5a-bc4e-48e8-a9a1-3738103894a1" path="/var/lib/kubelet/pods/f6520c5a-bc4e-48e8-a9a1-3738103894a1/volumes" Mar 14 09:50:53 crc 
kubenswrapper[4956]: I0314 09:50:53.209024 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:50:53 crc kubenswrapper[4956]: E0314 09:50:53.210464 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mxjrk_openshift-machine-config-operator(9ba20367-e506-422e-a846-eb1525cb3b94)\"" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" Mar 14 09:50:59 crc kubenswrapper[4956]: I0314 09:50:59.828740 4956 scope.go:117] "RemoveContainer" containerID="88cb6cb2d57cffc62756b9e28f4edd982d6ceac63b0cbd88465100267e440288" Mar 14 09:50:59 crc kubenswrapper[4956]: I0314 09:50:59.849527 4956 scope.go:117] "RemoveContainer" containerID="41b3a68e9782fa6a945a3081a4b8ec44473c1c14b30979c5d702fddcd8ba6913" Mar 14 09:51:08 crc kubenswrapper[4956]: I0314 09:51:08.209089 4956 scope.go:117] "RemoveContainer" containerID="41133f2cff4e193ea896239652a99128d1d7f1185c3837336c95c619d9983634" Mar 14 09:51:08 crc kubenswrapper[4956]: I0314 09:51:08.775326 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" event={"ID":"9ba20367-e506-422e-a846-eb1525cb3b94","Type":"ContainerStarted","Data":"824b2a5065747f5d16e2598a2a59aff3e72ab01b2551d328b0b670560f4098bb"} Mar 14 09:52:00 crc kubenswrapper[4956]: I0314 09:52:00.136579 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558032-4rmz9"] Mar 14 09:52:00 crc kubenswrapper[4956]: E0314 09:52:00.138506 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6520c5a-bc4e-48e8-a9a1-3738103894a1" containerName="extract-content" Mar 14 09:52:00 crc kubenswrapper[4956]: I0314 09:52:00.138604 4956 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f6520c5a-bc4e-48e8-a9a1-3738103894a1" containerName="extract-content" Mar 14 09:52:00 crc kubenswrapper[4956]: E0314 09:52:00.138689 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6520c5a-bc4e-48e8-a9a1-3738103894a1" containerName="extract-utilities" Mar 14 09:52:00 crc kubenswrapper[4956]: I0314 09:52:00.138750 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6520c5a-bc4e-48e8-a9a1-3738103894a1" containerName="extract-utilities" Mar 14 09:52:00 crc kubenswrapper[4956]: E0314 09:52:00.138822 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6520c5a-bc4e-48e8-a9a1-3738103894a1" containerName="registry-server" Mar 14 09:52:00 crc kubenswrapper[4956]: I0314 09:52:00.138884 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6520c5a-bc4e-48e8-a9a1-3738103894a1" containerName="registry-server" Mar 14 09:52:00 crc kubenswrapper[4956]: I0314 09:52:00.139147 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6520c5a-bc4e-48e8-a9a1-3738103894a1" containerName="registry-server" Mar 14 09:52:00 crc kubenswrapper[4956]: I0314 09:52:00.139799 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558032-4rmz9" Mar 14 09:52:00 crc kubenswrapper[4956]: I0314 09:52:00.141804 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tb6cx" Mar 14 09:52:00 crc kubenswrapper[4956]: I0314 09:52:00.141931 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:52:00 crc kubenswrapper[4956]: I0314 09:52:00.142270 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:52:00 crc kubenswrapper[4956]: I0314 09:52:00.147062 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558032-4rmz9"] Mar 14 09:52:00 crc kubenswrapper[4956]: I0314 09:52:00.320548 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlsst\" (UniqueName: \"kubernetes.io/projected/18b1d5ce-78cd-4080-860d-4ccc5b47780c-kube-api-access-jlsst\") pod \"auto-csr-approver-29558032-4rmz9\" (UID: \"18b1d5ce-78cd-4080-860d-4ccc5b47780c\") " pod="openshift-infra/auto-csr-approver-29558032-4rmz9" Mar 14 09:52:00 crc kubenswrapper[4956]: I0314 09:52:00.421862 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlsst\" (UniqueName: \"kubernetes.io/projected/18b1d5ce-78cd-4080-860d-4ccc5b47780c-kube-api-access-jlsst\") pod \"auto-csr-approver-29558032-4rmz9\" (UID: \"18b1d5ce-78cd-4080-860d-4ccc5b47780c\") " pod="openshift-infra/auto-csr-approver-29558032-4rmz9" Mar 14 09:52:00 crc kubenswrapper[4956]: I0314 09:52:00.441572 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlsst\" (UniqueName: \"kubernetes.io/projected/18b1d5ce-78cd-4080-860d-4ccc5b47780c-kube-api-access-jlsst\") pod \"auto-csr-approver-29558032-4rmz9\" (UID: \"18b1d5ce-78cd-4080-860d-4ccc5b47780c\") " 
pod="openshift-infra/auto-csr-approver-29558032-4rmz9" Mar 14 09:52:00 crc kubenswrapper[4956]: I0314 09:52:00.463056 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558032-4rmz9" Mar 14 09:52:00 crc kubenswrapper[4956]: I0314 09:52:00.878764 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558032-4rmz9"] Mar 14 09:52:01 crc kubenswrapper[4956]: I0314 09:52:01.136318 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558032-4rmz9" event={"ID":"18b1d5ce-78cd-4080-860d-4ccc5b47780c","Type":"ContainerStarted","Data":"8989cbcd66f84e3dad48ba47b45e862a6889e85419daa5b342688e70a47b28c5"} Mar 14 09:52:02 crc kubenswrapper[4956]: I0314 09:52:02.146474 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558032-4rmz9" event={"ID":"18b1d5ce-78cd-4080-860d-4ccc5b47780c","Type":"ContainerStarted","Data":"f6bb754c24d5509307ac730a8cf1fedbf6940a017ead199ecab42cdd9c1f4662"} Mar 14 09:52:03 crc kubenswrapper[4956]: I0314 09:52:03.155350 4956 generic.go:334] "Generic (PLEG): container finished" podID="18b1d5ce-78cd-4080-860d-4ccc5b47780c" containerID="f6bb754c24d5509307ac730a8cf1fedbf6940a017ead199ecab42cdd9c1f4662" exitCode=0 Mar 14 09:52:03 crc kubenswrapper[4956]: I0314 09:52:03.155445 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558032-4rmz9" event={"ID":"18b1d5ce-78cd-4080-860d-4ccc5b47780c","Type":"ContainerDied","Data":"f6bb754c24d5509307ac730a8cf1fedbf6940a017ead199ecab42cdd9c1f4662"} Mar 14 09:52:04 crc kubenswrapper[4956]: I0314 09:52:04.450150 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558032-4rmz9" Mar 14 09:52:04 crc kubenswrapper[4956]: I0314 09:52:04.584583 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlsst\" (UniqueName: \"kubernetes.io/projected/18b1d5ce-78cd-4080-860d-4ccc5b47780c-kube-api-access-jlsst\") pod \"18b1d5ce-78cd-4080-860d-4ccc5b47780c\" (UID: \"18b1d5ce-78cd-4080-860d-4ccc5b47780c\") " Mar 14 09:52:04 crc kubenswrapper[4956]: I0314 09:52:04.591117 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b1d5ce-78cd-4080-860d-4ccc5b47780c-kube-api-access-jlsst" (OuterVolumeSpecName: "kube-api-access-jlsst") pod "18b1d5ce-78cd-4080-860d-4ccc5b47780c" (UID: "18b1d5ce-78cd-4080-860d-4ccc5b47780c"). InnerVolumeSpecName "kube-api-access-jlsst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:52:04 crc kubenswrapper[4956]: I0314 09:52:04.686510 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlsst\" (UniqueName: \"kubernetes.io/projected/18b1d5ce-78cd-4080-860d-4ccc5b47780c-kube-api-access-jlsst\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:05 crc kubenswrapper[4956]: I0314 09:52:05.170609 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558032-4rmz9" event={"ID":"18b1d5ce-78cd-4080-860d-4ccc5b47780c","Type":"ContainerDied","Data":"8989cbcd66f84e3dad48ba47b45e862a6889e85419daa5b342688e70a47b28c5"} Mar 14 09:52:05 crc kubenswrapper[4956]: I0314 09:52:05.170665 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8989cbcd66f84e3dad48ba47b45e862a6889e85419daa5b342688e70a47b28c5" Mar 14 09:52:05 crc kubenswrapper[4956]: I0314 09:52:05.170679 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558032-4rmz9" Mar 14 09:52:05 crc kubenswrapper[4956]: I0314 09:52:05.224915 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558026-9c6gt"] Mar 14 09:52:05 crc kubenswrapper[4956]: I0314 09:52:05.230768 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558026-9c6gt"] Mar 14 09:52:07 crc kubenswrapper[4956]: I0314 09:52:07.218435 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="864a9dee-9abe-458e-9fc7-151e1df7c41c" path="/var/lib/kubelet/pods/864a9dee-9abe-458e-9fc7-151e1df7c41c/volumes" Mar 14 09:52:20 crc kubenswrapper[4956]: I0314 09:52:20.310792 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fvsdp"] Mar 14 09:52:20 crc kubenswrapper[4956]: E0314 09:52:20.311614 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b1d5ce-78cd-4080-860d-4ccc5b47780c" containerName="oc" Mar 14 09:52:20 crc kubenswrapper[4956]: I0314 09:52:20.311634 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b1d5ce-78cd-4080-860d-4ccc5b47780c" containerName="oc" Mar 14 09:52:20 crc kubenswrapper[4956]: I0314 09:52:20.311835 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b1d5ce-78cd-4080-860d-4ccc5b47780c" containerName="oc" Mar 14 09:52:20 crc kubenswrapper[4956]: I0314 09:52:20.313266 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvsdp" Mar 14 09:52:20 crc kubenswrapper[4956]: I0314 09:52:20.326471 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvsdp"] Mar 14 09:52:20 crc kubenswrapper[4956]: I0314 09:52:20.352092 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44bqj\" (UniqueName: \"kubernetes.io/projected/77669dd7-b1ab-4f49-8410-e3985bcbd02d-kube-api-access-44bqj\") pod \"redhat-marketplace-fvsdp\" (UID: \"77669dd7-b1ab-4f49-8410-e3985bcbd02d\") " pod="openshift-marketplace/redhat-marketplace-fvsdp" Mar 14 09:52:20 crc kubenswrapper[4956]: I0314 09:52:20.352266 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77669dd7-b1ab-4f49-8410-e3985bcbd02d-utilities\") pod \"redhat-marketplace-fvsdp\" (UID: \"77669dd7-b1ab-4f49-8410-e3985bcbd02d\") " pod="openshift-marketplace/redhat-marketplace-fvsdp" Mar 14 09:52:20 crc kubenswrapper[4956]: I0314 09:52:20.352324 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77669dd7-b1ab-4f49-8410-e3985bcbd02d-catalog-content\") pod \"redhat-marketplace-fvsdp\" (UID: \"77669dd7-b1ab-4f49-8410-e3985bcbd02d\") " pod="openshift-marketplace/redhat-marketplace-fvsdp" Mar 14 09:52:20 crc kubenswrapper[4956]: I0314 09:52:20.453404 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77669dd7-b1ab-4f49-8410-e3985bcbd02d-utilities\") pod \"redhat-marketplace-fvsdp\" (UID: \"77669dd7-b1ab-4f49-8410-e3985bcbd02d\") " pod="openshift-marketplace/redhat-marketplace-fvsdp" Mar 14 09:52:20 crc kubenswrapper[4956]: I0314 09:52:20.453462 4956 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77669dd7-b1ab-4f49-8410-e3985bcbd02d-catalog-content\") pod \"redhat-marketplace-fvsdp\" (UID: \"77669dd7-b1ab-4f49-8410-e3985bcbd02d\") " pod="openshift-marketplace/redhat-marketplace-fvsdp" Mar 14 09:52:20 crc kubenswrapper[4956]: I0314 09:52:20.453535 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44bqj\" (UniqueName: \"kubernetes.io/projected/77669dd7-b1ab-4f49-8410-e3985bcbd02d-kube-api-access-44bqj\") pod \"redhat-marketplace-fvsdp\" (UID: \"77669dd7-b1ab-4f49-8410-e3985bcbd02d\") " pod="openshift-marketplace/redhat-marketplace-fvsdp" Mar 14 09:52:20 crc kubenswrapper[4956]: I0314 09:52:20.453932 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77669dd7-b1ab-4f49-8410-e3985bcbd02d-utilities\") pod \"redhat-marketplace-fvsdp\" (UID: \"77669dd7-b1ab-4f49-8410-e3985bcbd02d\") " pod="openshift-marketplace/redhat-marketplace-fvsdp" Mar 14 09:52:20 crc kubenswrapper[4956]: I0314 09:52:20.454034 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77669dd7-b1ab-4f49-8410-e3985bcbd02d-catalog-content\") pod \"redhat-marketplace-fvsdp\" (UID: \"77669dd7-b1ab-4f49-8410-e3985bcbd02d\") " pod="openshift-marketplace/redhat-marketplace-fvsdp" Mar 14 09:52:20 crc kubenswrapper[4956]: I0314 09:52:20.473654 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44bqj\" (UniqueName: \"kubernetes.io/projected/77669dd7-b1ab-4f49-8410-e3985bcbd02d-kube-api-access-44bqj\") pod \"redhat-marketplace-fvsdp\" (UID: \"77669dd7-b1ab-4f49-8410-e3985bcbd02d\") " pod="openshift-marketplace/redhat-marketplace-fvsdp" Mar 14 09:52:20 crc kubenswrapper[4956]: I0314 09:52:20.636229 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvsdp" Mar 14 09:52:21 crc kubenswrapper[4956]: I0314 09:52:21.105758 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvsdp"] Mar 14 09:52:21 crc kubenswrapper[4956]: I0314 09:52:21.299715 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvsdp" event={"ID":"77669dd7-b1ab-4f49-8410-e3985bcbd02d","Type":"ContainerStarted","Data":"e76caa67a5d505c0802247393add4e13649dfe575502fb79125adc5b98d8ed95"} Mar 14 09:52:22 crc kubenswrapper[4956]: I0314 09:52:22.308696 4956 generic.go:334] "Generic (PLEG): container finished" podID="77669dd7-b1ab-4f49-8410-e3985bcbd02d" containerID="2dfee2db4ce70595571e03ec8cac3cde58ce7acef1f7fc47bdea918931362a37" exitCode=0 Mar 14 09:52:22 crc kubenswrapper[4956]: I0314 09:52:22.308747 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvsdp" event={"ID":"77669dd7-b1ab-4f49-8410-e3985bcbd02d","Type":"ContainerDied","Data":"2dfee2db4ce70595571e03ec8cac3cde58ce7acef1f7fc47bdea918931362a37"} Mar 14 09:52:23 crc kubenswrapper[4956]: I0314 09:52:23.318942 4956 generic.go:334] "Generic (PLEG): container finished" podID="77669dd7-b1ab-4f49-8410-e3985bcbd02d" containerID="ef003ab02120d22e363295b4aa78353553cc97d423213c77ce3ae10ac2631419" exitCode=0 Mar 14 09:52:23 crc kubenswrapper[4956]: I0314 09:52:23.319051 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvsdp" event={"ID":"77669dd7-b1ab-4f49-8410-e3985bcbd02d","Type":"ContainerDied","Data":"ef003ab02120d22e363295b4aa78353553cc97d423213c77ce3ae10ac2631419"} Mar 14 09:52:24 crc kubenswrapper[4956]: I0314 09:52:24.329396 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvsdp" 
event={"ID":"77669dd7-b1ab-4f49-8410-e3985bcbd02d","Type":"ContainerStarted","Data":"986226e140cc2f7bbdc351585f8b9b008d17d7eb81528c9b2645209fdc7d4cf6"} Mar 14 09:52:24 crc kubenswrapper[4956]: I0314 09:52:24.350498 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fvsdp" podStartSLOduration=2.931964774 podStartE2EDuration="4.350457727s" podCreationTimestamp="2026-03-14 09:52:20 +0000 UTC" firstStartedPulling="2026-03-14 09:52:22.3102743 +0000 UTC m=+3347.822966568" lastFinishedPulling="2026-03-14 09:52:23.728767253 +0000 UTC m=+3349.241459521" observedRunningTime="2026-03-14 09:52:24.344332723 +0000 UTC m=+3349.857025001" watchObservedRunningTime="2026-03-14 09:52:24.350457727 +0000 UTC m=+3349.863150005" Mar 14 09:52:30 crc kubenswrapper[4956]: I0314 09:52:30.636725 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fvsdp" Mar 14 09:52:30 crc kubenswrapper[4956]: I0314 09:52:30.637203 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fvsdp" Mar 14 09:52:30 crc kubenswrapper[4956]: I0314 09:52:30.676055 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fvsdp" Mar 14 09:52:31 crc kubenswrapper[4956]: I0314 09:52:31.422284 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fvsdp" Mar 14 09:52:34 crc kubenswrapper[4956]: I0314 09:52:34.299591 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvsdp"] Mar 14 09:52:34 crc kubenswrapper[4956]: I0314 09:52:34.300184 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fvsdp" podUID="77669dd7-b1ab-4f49-8410-e3985bcbd02d" containerName="registry-server" 
containerID="cri-o://986226e140cc2f7bbdc351585f8b9b008d17d7eb81528c9b2645209fdc7d4cf6" gracePeriod=2 Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.328283 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvsdp" Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.399886 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77669dd7-b1ab-4f49-8410-e3985bcbd02d-utilities\") pod \"77669dd7-b1ab-4f49-8410-e3985bcbd02d\" (UID: \"77669dd7-b1ab-4f49-8410-e3985bcbd02d\") " Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.400546 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77669dd7-b1ab-4f49-8410-e3985bcbd02d-catalog-content\") pod \"77669dd7-b1ab-4f49-8410-e3985bcbd02d\" (UID: \"77669dd7-b1ab-4f49-8410-e3985bcbd02d\") " Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.400660 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77669dd7-b1ab-4f49-8410-e3985bcbd02d-utilities" (OuterVolumeSpecName: "utilities") pod "77669dd7-b1ab-4f49-8410-e3985bcbd02d" (UID: "77669dd7-b1ab-4f49-8410-e3985bcbd02d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.400952 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44bqj\" (UniqueName: \"kubernetes.io/projected/77669dd7-b1ab-4f49-8410-e3985bcbd02d-kube-api-access-44bqj\") pod \"77669dd7-b1ab-4f49-8410-e3985bcbd02d\" (UID: \"77669dd7-b1ab-4f49-8410-e3985bcbd02d\") " Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.402073 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77669dd7-b1ab-4f49-8410-e3985bcbd02d-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.406651 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77669dd7-b1ab-4f49-8410-e3985bcbd02d-kube-api-access-44bqj" (OuterVolumeSpecName: "kube-api-access-44bqj") pod "77669dd7-b1ab-4f49-8410-e3985bcbd02d" (UID: "77669dd7-b1ab-4f49-8410-e3985bcbd02d"). InnerVolumeSpecName "kube-api-access-44bqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.412815 4956 generic.go:334] "Generic (PLEG): container finished" podID="77669dd7-b1ab-4f49-8410-e3985bcbd02d" containerID="986226e140cc2f7bbdc351585f8b9b008d17d7eb81528c9b2645209fdc7d4cf6" exitCode=0 Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.412865 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvsdp" event={"ID":"77669dd7-b1ab-4f49-8410-e3985bcbd02d","Type":"ContainerDied","Data":"986226e140cc2f7bbdc351585f8b9b008d17d7eb81528c9b2645209fdc7d4cf6"} Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.412898 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvsdp" event={"ID":"77669dd7-b1ab-4f49-8410-e3985bcbd02d","Type":"ContainerDied","Data":"e76caa67a5d505c0802247393add4e13649dfe575502fb79125adc5b98d8ed95"} Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.412919 4956 scope.go:117] "RemoveContainer" containerID="986226e140cc2f7bbdc351585f8b9b008d17d7eb81528c9b2645209fdc7d4cf6" Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.413072 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvsdp" Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.434404 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77669dd7-b1ab-4f49-8410-e3985bcbd02d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77669dd7-b1ab-4f49-8410-e3985bcbd02d" (UID: "77669dd7-b1ab-4f49-8410-e3985bcbd02d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.449575 4956 scope.go:117] "RemoveContainer" containerID="ef003ab02120d22e363295b4aa78353553cc97d423213c77ce3ae10ac2631419" Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.469872 4956 scope.go:117] "RemoveContainer" containerID="2dfee2db4ce70595571e03ec8cac3cde58ce7acef1f7fc47bdea918931362a37" Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.503661 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44bqj\" (UniqueName: \"kubernetes.io/projected/77669dd7-b1ab-4f49-8410-e3985bcbd02d-kube-api-access-44bqj\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.503708 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77669dd7-b1ab-4f49-8410-e3985bcbd02d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.508691 4956 scope.go:117] "RemoveContainer" containerID="986226e140cc2f7bbdc351585f8b9b008d17d7eb81528c9b2645209fdc7d4cf6" Mar 14 09:52:35 crc kubenswrapper[4956]: E0314 09:52:35.509189 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"986226e140cc2f7bbdc351585f8b9b008d17d7eb81528c9b2645209fdc7d4cf6\": container with ID starting with 986226e140cc2f7bbdc351585f8b9b008d17d7eb81528c9b2645209fdc7d4cf6 not found: ID does not exist" containerID="986226e140cc2f7bbdc351585f8b9b008d17d7eb81528c9b2645209fdc7d4cf6" Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.509219 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"986226e140cc2f7bbdc351585f8b9b008d17d7eb81528c9b2645209fdc7d4cf6"} err="failed to get container status \"986226e140cc2f7bbdc351585f8b9b008d17d7eb81528c9b2645209fdc7d4cf6\": rpc error: code = NotFound desc = could not find 
container \"986226e140cc2f7bbdc351585f8b9b008d17d7eb81528c9b2645209fdc7d4cf6\": container with ID starting with 986226e140cc2f7bbdc351585f8b9b008d17d7eb81528c9b2645209fdc7d4cf6 not found: ID does not exist" Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.509241 4956 scope.go:117] "RemoveContainer" containerID="ef003ab02120d22e363295b4aa78353553cc97d423213c77ce3ae10ac2631419" Mar 14 09:52:35 crc kubenswrapper[4956]: E0314 09:52:35.509606 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef003ab02120d22e363295b4aa78353553cc97d423213c77ce3ae10ac2631419\": container with ID starting with ef003ab02120d22e363295b4aa78353553cc97d423213c77ce3ae10ac2631419 not found: ID does not exist" containerID="ef003ab02120d22e363295b4aa78353553cc97d423213c77ce3ae10ac2631419" Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.509627 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef003ab02120d22e363295b4aa78353553cc97d423213c77ce3ae10ac2631419"} err="failed to get container status \"ef003ab02120d22e363295b4aa78353553cc97d423213c77ce3ae10ac2631419\": rpc error: code = NotFound desc = could not find container \"ef003ab02120d22e363295b4aa78353553cc97d423213c77ce3ae10ac2631419\": container with ID starting with ef003ab02120d22e363295b4aa78353553cc97d423213c77ce3ae10ac2631419 not found: ID does not exist" Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.509645 4956 scope.go:117] "RemoveContainer" containerID="2dfee2db4ce70595571e03ec8cac3cde58ce7acef1f7fc47bdea918931362a37" Mar 14 09:52:35 crc kubenswrapper[4956]: E0314 09:52:35.509892 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dfee2db4ce70595571e03ec8cac3cde58ce7acef1f7fc47bdea918931362a37\": container with ID starting with 2dfee2db4ce70595571e03ec8cac3cde58ce7acef1f7fc47bdea918931362a37 not found: ID does 
not exist" containerID="2dfee2db4ce70595571e03ec8cac3cde58ce7acef1f7fc47bdea918931362a37" Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.509915 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dfee2db4ce70595571e03ec8cac3cde58ce7acef1f7fc47bdea918931362a37"} err="failed to get container status \"2dfee2db4ce70595571e03ec8cac3cde58ce7acef1f7fc47bdea918931362a37\": rpc error: code = NotFound desc = could not find container \"2dfee2db4ce70595571e03ec8cac3cde58ce7acef1f7fc47bdea918931362a37\": container with ID starting with 2dfee2db4ce70595571e03ec8cac3cde58ce7acef1f7fc47bdea918931362a37 not found: ID does not exist" Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.747288 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvsdp"] Mar 14 09:52:35 crc kubenswrapper[4956]: I0314 09:52:35.755152 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvsdp"] Mar 14 09:52:37 crc kubenswrapper[4956]: I0314 09:52:37.218660 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77669dd7-b1ab-4f49-8410-e3985bcbd02d" path="/var/lib/kubelet/pods/77669dd7-b1ab-4f49-8410-e3985bcbd02d/volumes" Mar 14 09:52:39 crc kubenswrapper[4956]: I0314 09:52:39.837670 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-bf6b7fd8c-56k2s" podUID="401a3c6d-db2a-435b-b7f5-08816736d895" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.86:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 09:52:59 crc kubenswrapper[4956]: I0314 09:52:59.957252 4956 scope.go:117] "RemoveContainer" containerID="632f5f42be523f6a37ba490f4b0f04516ae5ca9deead2e599ce669aa8e969ca4" Mar 14 09:53:25 crc kubenswrapper[4956]: I0314 09:53:25.423885 4956 patch_prober.go:28] interesting 
pod/machine-config-daemon-mxjrk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:53:25 crc kubenswrapper[4956]: I0314 09:53:25.424509 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mxjrk" podUID="9ba20367-e506-422e-a846-eb1525cb3b94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"